[JBoss JIRA] (WFWIP-187) Changes to PVC are not reflected in Operator
by Jeff Mesnil (Jira)
[ https://issues.jboss.org/browse/WFWIP-187?page=com.atlassian.jira.plugin.... ]
Jeff Mesnil commented on WFWIP-187:
-----------------------------------
I'm not sure there is an issue here; this seems to work as expected for PVCs.
https://kubernetes.io/docs/tasks/run-application/delete-stateful-set/#per...:
Deleting the Pods in a StatefulSet will not delete the associated volumes. This is to ensure that you have the chance to copy data off the volume before deleting it. Deleting the PVC after the pods have left the terminating state might trigger deletion of the backing Persistent Volumes depending on the storage class and reclaim policy. You should never assume ability to access a volume after claim deletion.
> Changes to PVC are not reflected in Operator
> --------------------------------------------
>
> Key: WFWIP-187
> URL: https://issues.jboss.org/browse/WFWIP-187
> Project: WildFly WIP
> Issue Type: Bug
> Components: OpenShift
> Reporter: Martin Choma
> Assignee: Jeff Mesnil
> Priority: Blocker
> Labels: operator
>
> Any changes (adding, removing, or updating) made to the PVC after the WildFlyServer CR was created are not reflected in the underlying PVC Kubernetes object.
--
This message was sent by Atlassian Jira
(v7.13.8#713008)
[JBoss JIRA] (WFWIP-187) Changes to PVC are not reflected in Operator
by Jeff Mesnil (Jira)
[ https://issues.jboss.org/browse/WFWIP-187?page=com.atlassian.jira.plugin.... ]
Jeff Mesnil commented on WFWIP-187:
-----------------------------------
I'm using the metadata.generation to ensure that underlying Kubernetes resources (like StatefulSet) are up to date with the latest WildFlyServer generation.
The new annotation on the StatefulSet tracks the *WildFlyServer* generation so that, in the reconcile loop, if we notice that the StatefulSet is out of date, we update or delete it so that it complies with the latest WildFlyServer spec.
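The operator itself is written in Go, but the staleness check described above can be sketched language-agnostically. Below is a minimal illustration in Java under stated assumptions: the annotation key, class, and method names are hypothetical, not the operator's actual identifiers.

```java
import java.util.HashMap;
import java.util.Map;

public class GenerationCheck {
    // Hypothetical annotation key; the key actually used by the operator may differ.
    static final String GENERATION_ANNOTATION = "wildfly.org/wildflyserver-generation";

    // Returns true when the StatefulSet's recorded generation is older than the
    // current WildFlyServer metadata.generation, i.e. the reconcile loop must
    // update or recreate the StatefulSet to match the latest spec.
    static boolean isOutOfDate(Map<String, String> statefulSetAnnotations, long currentGeneration) {
        String recorded = statefulSetAnnotations.get(GENERATION_ANNOTATION);
        if (recorded == null) {
            return true; // never reconciled against any generation: treat as stale
        }
        try {
            return Long.parseLong(recorded) < currentGeneration;
        } catch (NumberFormatException e) {
            return true; // unreadable annotation: safer to treat as stale
        }
    }

    public static void main(String[] args) {
        Map<String, String> annotations = new HashMap<>();
        annotations.put(GENERATION_ANNOTATION, "3");
        System.out.println(isOutOfDate(annotations, 3)); // false: up to date
        System.out.println(isOutOfDate(annotations, 4)); // true: spec changed
    }
}
```

The key design point is that the StatefulSet carries the *WildFlyServer* generation it was built from, so a single comparison in each reconcile pass detects any spec change, including PVC changes.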
[JBoss JIRA] (WFLY-12624) Return hostname instead of IP address when generating default client mapping
by Richard Achmatowicz (Jira)
Richard Achmatowicz created WFLY-12624:
------------------------------------------
Summary: Return hostname instead of IP address when generating default client mapping
Key: WFLY-12624
URL: https://issues.jboss.org/browse/WFLY-12624
Project: WildFly
Issue Type: Bug
Components: Clustering
Affects Versions: 18.0.0.Beta1
Environment: An EJB client application interacting with a cluster of Wildfly server nodes
Reporter: Richard Achmatowicz
Assignee: Paul Ferraro
When an EJB client application interacts with a clustered Wildfly deployment, it receives topology updates from cluster nodes describing the membership of the cluster.
For each node in the cluster, a set of one or more client mappings is provided to indicate how the client may connect to the node, if it hasn't already. If the node is connected to a single network, there will be one client mapping; if the node is multi-homed and connected to two networks, there will be two client mappings, etc. Client mappings specify three things: a CIDR representation of the network the client may be on, a destination hostname or IP address and a destination port.
Client mappings may be generated by default (if none are provided in the server profile) or may be specified by the user via client mappings defined in the socket binding of the Remoting connector. For example:
{noformat}
<socket-binding name="remoting" port="1099">
<client-mapping source-network="135.121.1.29/16" destination-address="135.121.1.29" destination-port="1099"/>
</socket-binding>
{noformat}
When the client mapping information is received by the EJB client application, it is added to the discovered node registry (DNR) in the Discovery component of the EJB client. The DNR represents all known information about nodes with which the client can interact which was received from nodes in one or more clusters.
When an invocation is attempted, the Discovery component uses this information to generate a set of ServiceURLs which represent candidate targets (i.e. servers containing the deployment and module the client is invoking on) for the invocation. Discovery uses "an algorithm" to convert the information in the DNR into a corresponding set of ServiceURLs, then selects one such ServiceURL and returns it as the target for the invocation.
Discovery obtains its node information from various sources: client mappings associated with cluster nodes, as described above, as well as Remoting endpoints associated with established connections to nodes. These pieces of information describe, at a minimum, a host and a port.
The problem is that "the algorithm" used in Discovery to compute the set of ServiceURLs treats hostnames and IP addresses as simple strings. So "localhost" and "127.0.0.1" are treated as different hosts, even though they refer to the same host. This results in an incomplete/incorrect set of ServiceURLs being generated which in turn leads to incorrect Discovery failures.
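A minimal sketch of the mismatch, with a hypothetical normalization helper (this is not the actual Discovery code). Resolving host strings to addresses before comparing would let "localhost" and "127.0.0.1" be identified, though resolution results can vary by host configuration (e.g. localhost may resolve to ::1 on IPv6-first systems):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class HostKeyExample {
    // Hypothetical helper: normalize a host string to its resolved IP address
    // so that different spellings of the same host compare equal. Falls back
    // to the raw string when the name cannot be resolved.
    static String canonicalHost(String host) {
        try {
            return InetAddress.getByName(host).getHostAddress();
        } catch (UnknownHostException e) {
            return host;
        }
    }

    public static void main(String[] args) {
        // Treated as two different hosts when compared as plain strings:
        System.out.println("localhost".equals("127.0.0.1")); // false
        // Comparing resolved addresses identifies them on a typical
        // IPv4-first machine (an IPv6-first machine may resolve
        // localhost to ::1 instead):
        System.out.println(canonicalHost("localhost")
                .equals(canonicalHost("127.0.0.1")));
    }
}
```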
[JBoss JIRA] (WFWIP-211) emptyDir.sizeLimit not propagated to Pod Spec
by Jeff Mesnil (Jira)
[ https://issues.jboss.org/browse/WFWIP-211?page=com.atlassian.jira.plugin.... ]
Jeff Mesnil commented on WFWIP-211:
-----------------------------------
[~mchoma] the issue is *not* present when I test on Kubernetes with WildFly Operator.
The issue *is* present on OpenShift with EAP Operator.
I think this is an OpenShift issue that's not related to the EAP Operator.
If I create a simple pod with a memory-backed emptyDir and a sizeLimit of 5Mi, the limit is apparently not taken into account by OpenShift:
{code}
$ cat emptydir-test.yaml
apiVersion: v1
kind: Pod
metadata:
name: emptydir-test-pod
spec:
containers:
- image: alpine
imagePullPolicy: IfNotPresent
name: myvolumes-container
command: [ 'sh', '-c', 'echo The Bench Container 1 is Running ; sleep 3600']
volumeMounts:
- mountPath: /demo
name: demo-volume
volumes:
- name: demo-volume
emptyDir:
medium: Memory
sizeLimit: 5Mi
$ oc apply -f emptydir-test.yaml
pod/emptydir-test-pod created
$ oc get -o yaml pod emptydir-test-pod
...
volumes:
- emptyDir:
medium: Memory
name: demo-volume
- name: default-token-2qwn8
secret:
defaultMode: 420
secretName: default-token-2qwn8
...
{code}
I suppose OpenShift does not accept a sizeLimit for memory-backed emptyDir volumes (or applies a default one), but I could not find a definitive answer in the docs.
I'd suggest we close this issue, as it is not related to the operator. Do you agree?
> emptyDir.sizeLimit not propagated to Pod Spec
> ---------------------------------------------
>
> Key: WFWIP-211
> URL: https://issues.jboss.org/browse/WFWIP-211
> Project: WildFly WIP
> Issue Type: Bug
> Components: OpenShift
> Reporter: Martin Choma
> Assignee: Jeff Mesnil
> Priority: Blocker
> Labels: operator
>
> # {code:yaml}
> apiVersion: wildfly.org/v1alpha1
> kind: WildFlyServer
> metadata:
> name: operator-empty-dir
> namespace: mchoma
> spec:
> applicationImage: 'registry.access.redhat.com/jboss-eap-7/eap72-openshift:1.1'
> size: 1
> storage:
> emptyDir:
> medium: Memory
> sizeLimit: 1Mi
> {code}
> # wait until the pod is started and look into the pod YAML definition; "1Mi" is not there
> {code:yaml}
> ...
> serviceAccount: default
> serviceAccountName: default
> subdomain: operator-empty-dir-headless
> terminationGracePeriodSeconds: 30
> volumes:
> - emptyDir:
> medium: Memory
> name: operator-empty-dir-volume
> - name: default-token-j2grg
> secret:
> defaultMode: 420
> secretName: default-token-j2grg
> status:
> conditions:
> ...
> {code}
[JBoss JIRA] (DROOLS-4481) [DMN Designer] Data Types - Business Central Data Objects as DMN Data Types UX
by Elizabeth Clayton (Jira)
[ https://issues.jboss.org/browse/DROOLS-4481?page=com.atlassian.jira.plugi... ]
Elizabeth Clayton resolved DROOLS-4481.
---------------------------------------
Resolution: Done
[~karreiro] [~tirelli] I updated the click-thru based on team feedback. The current mockup includes the suggested enhancements (inline alerts and import as collapsed), so I'm marking this as Resolved. Thanks!
> [DMN Designer] Data Types - Business Central Data Objects as DMN Data Types UX
> ------------------------------------------------------------------------------
>
> Key: DROOLS-4481
> URL: https://issues.jboss.org/browse/DROOLS-4481
> Project: Drools
> Issue Type: Task
> Components: DMN Editor
> Reporter: Guilherme Gomes
> Assignee: Elizabeth Clayton
> Priority: Major
> Labels: UX, UXTeam, drools-tools
> Attachments: 2019-08-26 17.48.30.gif, 2_introtext.png, Screen Shot 2019-08-26 at 18.11.01.png
>
>
> *Requirements*
> It should be possible to generate data types from imported data models (Java classes).
> * As a user I want to be able to use data type definitions that are structured similarly to Java data object models (classes) that I have defined externally.
> * As a user I want to be able to edit and update data objects that have been converted to data types, so that I can manually update the definitions.
> Import/convert data objects:
> * Up to 5 levels deep, the importer can “introspect” and convert the data model; beyond that, the data type would be “Any.”
> * Import (convert) only within the Data Type tab; this is not a feature of the Import/Include function, as importing DOs is not supported in the DMN spec.
> *Current scenario*
> Currently, users can create Data Objects on Business Central. See:
> !2019-08-26 17.48.30.gif|width=600!
> However users cannot re-use Data Objects as Data Types.
> ---
> *Description*
> Data Objects (DO) are pretty similar to DMN Data Types (DT), so it would be great to import the DO above as a DMN DT like the following one:
> !Screen Shot 2019-08-26 at 18.11.01.png|width=600!
> ---
> *Questions to clarify at requirement level*
> 1) Some DOs can be quite complex and some fields can be impossible to guess.
> - Person
> -- name (Some strange type)
> -- age (Integer)
> What should we do? Import name as "Any"? Remove the name field? Or block the Person type?
> 2) Do we need a specific component to import Data Objects as Data Types? Couldn't we just add Data Objects in the type dropdown, but use a different category (Default, Custom Data Types, Data Objects)?
[JBoss JIRA] (DROOLS-4481) [DMN Designer] Data Types - Business Central Data Objects as DMN Data Types UX
by Elizabeth Clayton (Jira)
[ https://issues.jboss.org/browse/DROOLS-4481?page=com.atlassian.jira.plugi... ]
Elizabeth Clayton updated DROOLS-4481:
--------------------------------------
Attachment: 2_introtext.png
[JBoss JIRA] (DROOLS-4481) [DMN Designer] Data Types - Business Central Data Objects as DMN Data Types UX
by Elizabeth Clayton (Jira)
[ https://issues.jboss.org/browse/DROOLS-4481?page=com.atlassian.jira.plugi... ]
Elizabeth Clayton commented on DROOLS-4481:
-------------------------------------------
[~karreiro] I updated the click-thru with:
* Inline alerts. Please note: the examples in my mocks are just rough drafts to show the general concept. The exact styling should match PatternFly 3 recommendations, to be consistent with the overall UI. See: https://www.patternfly.org/v3/pattern-library/communication/inline-notifi...
* Importing the DO in its collapsed state, and a screen to show the expanded view.
* It's not visible in the click-thru, but here's a version with inline help text at the top of the dialog. In my mockups I replaced the help text with the inline error. I think that might be okay, but if you think it's best to move the whole area down to support the alert that would be good. Here's a mockup with the intro text, though we might want [~stetson.robinson] to review it.
!2_introtext.png|thumbnail!
[JBoss JIRA] (DROOLS-4558) executable-model doesn't fully parse multi-line pattern
by Luca Molteni (Jira)
[ https://issues.jboss.org/browse/DROOLS-4558?page=com.atlassian.jira.plugi... ]
Luca Molteni updated DROOLS-4558:
---------------------------------
Sprint: 2019 Week 38-40 (from Sep 16)
> executable-model doesn't fully parse multi-line pattern
> --------------------------------------------------------
>
> Key: DROOLS-4558
> URL: https://issues.jboss.org/browse/DROOLS-4558
> Project: Drools
> Issue Type: Bug
> Components: executable model
> Affects Versions: 7.24.0.Final, 7.25.0.Final, 7.26.0.Final, 7.27.0.Final
> Environment: - executable-model
> Reporter: Toshiya Kobayashi
> Assignee: Luca Molteni
> Priority: Major
> Labels: support
>
> When a pattern has multiple lines, executable-model doesn't fully parse its conditions.
> {noformat}
> rule R1 when
> $p : Person(age == 30
> || employed == true)
> then
> end
> {noformat}
> Looking at the generated Java code, only the first line is parsed.
> The problem does not occur in 7.23.0.Final, so it appears to be a regression introduced after that release.