DNA SVN: r1532 - trunk/dna-integration-tests/src/test/resources/tck/jdbcmeta.
by dna-commits@lists.jboss.org
Author: bcarothers
Date: 2010-01-05 19:36:37 -0500 (Tue, 05 Jan 2010)
New Revision: 1532
Modified:
trunk/dna-integration-tests/src/test/resources/tck/jdbcmeta/repositoryOverlay.properties
Log:
DNA-623 JDBC Metadata TCK Test Failing in Nightly Integration Build
Now that the query tests are enabled, the TCK test is running XPathPosIndexTest against all of the connectors. This test has a written prerequisite (in Javadoc) that the root node have at least three child nodes named the same as the value of the "nodeName1" property. However, the test actually checks a different precondition (more than three child nodes in total).
Since the JDBC Metadata connector doesn't support same-name siblings, it cannot meet the written prerequisite, but the configured root node did have more than three child nodes. So the test tried to execute, but the query (which ended in [2]) couldn't possibly return values from this connector.
The applied patch moves the root for the test cases to a node that only has two children, effectively bypassing this test[1]. This is actually better for most other tests since it moves the test root closer to the actual root of the repository and exposes more nodes to TCK testing. I tried to do this originally when I added the connector, but the performance was too slow.
[1] - This is why the FileSystemRepositoryTCKTest didn't also break. Although it doesn't support SNS either, the root node happened to only have two children so this test was effectively bypassed for that connector as well.
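The mismatch between the written prerequisite and the checked precondition can be sketched concretely. The following is a toy model, not the actual TCK code, and the child-node names are hypothetical stand-ins for what a connector without same-name-sibling support might expose:

```java
import java.util.List;

public class PreconditionSketch {
    // Written prerequisite (per the Javadoc): at least three children
    // all sharing the name given by the "nodeName1" property (same-name siblings).
    static boolean meetsWrittenPrerequisite(List<String> childNames, String nodeName1) {
        return childNames.stream().filter(nodeName1::equals).count() >= 3;
    }

    // What the test actually checks: more than three children in total,
    // regardless of their names.
    static boolean meetsCheckedPrecondition(List<String> childNames) {
        return childNames.size() > 3;
    }

    public static void main(String[] args) {
        // A connector without same-name siblings: all child names distinct
        List<String> children = List.of("TABLES", "VIEWS", "COLUMNS", "SCHEMATA");
        System.out.println(meetsWrittenPrerequisite(children, "nodeName1")); // false
        System.out.println(meetsCheckedPrecondition(children));              // true
    }
}
```

The checked precondition passes while the written one fails, so the test runs anyway and its positional query (ending in [2]) can never match; moving the test root to a node with only two children makes the checked precondition fail too, skipping the test.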
Modified: trunk/dna-integration-tests/src/test/resources/tck/jdbcmeta/repositoryOverlay.properties
===================================================================
--- trunk/dna-integration-tests/src/test/resources/tck/jdbcmeta/repositoryOverlay.properties 2010-01-05 20:33:47 UTC (rev 1531)
+++ trunk/dna-integration-tests/src/test/resources/tck/jdbcmeta/repositoryOverlay.properties 2010-01-06 00:36:37 UTC (rev 1532)
@@ -5,7 +5,7 @@
javax.jcr.tck.dnaNodeTypePath=/org/jboss/dna/connector/meta/jdbc/connector-metajdbc.cnd
# The fan-out from the tables node is HUGE in HSQLDB. This allows the tests to complete reasonably quickly.
-javax.jcr.tck.testroot=/default/INFORMATION_SCHEMA/tables/SYSTEM_ALIASES
+javax.jcr.tck.testroot=/default/INFORMATION_SCHEMA
# There's only one workspace per repository for this connector right now
javax.jcr.tck.workspacename=default
DNA SVN: r1530 - in trunk: dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect and 12 other directories.
by dna-commits@lists.jboss.org
Author: blafond
Date: 2010-01-05 15:33:13 -0500 (Tue, 05 Jan 2010)
New Revision: 1530
Added:
trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/DdlIntegrationTestUtil.java
trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/
trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/
trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/DerbyDdlSequencerIntegrationTest.java
trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/
trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/OracleDdlSequencerIntegrationTest.java
trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/
trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/PostgresDdlSequencerIntegrationTest.java
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/DerbyDdl.cnd
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/derby_test_statements.ddl
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/OracleDdl.cnd
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/oracle_test_statements.ddl
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/PostgresDdl.cnd
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/postgres_test_statements.ddl
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/grant_test_statements.ddl
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/revoke_test_statements.ddl
Removed:
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/DerbyDdl.cnd
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/OracleDdl.cnd
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/PostgresDdl.cnd
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/derby_test_statements.ddl
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/oracle_test_statements.ddl
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/postgres_test_statements.ddl
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/test_cnd.cnd
Modified:
trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/DdlSequencerIntegrationTest.java
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/StandardDdl.cnd
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/standard_test_statements.ddl
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/DdlConstants.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/StandardDdlLexicon.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/StandardDdlParser.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/oracle/OracleDdlConstants.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/oracle/OracleDdlParser.java
trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/StandardDdlParserTest.java
Log:
DNA-49 refactored ddl sequencer integration tests. Separated per dialect. Added standard revoke statement parsing.
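The new grant/revoke test fixtures exercise statements of the forms asserted in the diff below (e.g. GRANT SELECT ON TABLE purchaseOrders TO maria,harry; and its REVOKE counterpart). A minimal sketch of classifying such statements by leading keyword — a toy stand-in, not the actual StandardDdlParser dispatch:

```java
import java.util.Locale;

public class GrantRevokeSketch {
    // Map a DDL statement to the node type the sequencer tests assert on,
    // keyed off the leading keyword; the real parser does far more.
    static String statementType(String ddl) {
        String keyword = ddl.trim().split("\\s+")[0].toUpperCase(Locale.ROOT);
        switch (keyword) {
            case "GRANT":
                return "ddl:grantOnTableStatement";
            case "REVOKE":
                return "ddl:revokeOnTableStatement";
            default:
                return "ddl:unknownStatement";
        }
    }

    public static void main(String[] args) {
        // Statements mirrored from the test fixtures in this commit
        System.out.println(statementType("GRANT SELECT ON TABLE purchaseOrders TO maria,harry;"));
        System.out.println(statementType("REVOKE SELECT ON TABLE purchaseOrders FROM maria,harry;"));
    }
}
```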
Added: trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/DdlIntegrationTestUtil.java
===================================================================
--- trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/DdlIntegrationTestUtil.java (rev 0)
+++ trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/DdlIntegrationTestUtil.java 2010-01-05 20:33:13 UTC (rev 1530)
@@ -0,0 +1,348 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.test.integration.sequencer.ddl;
+
+import java.io.IOException;
+import java.net.URL;
+import java.util.Calendar;
+import static org.hamcrest.core.Is.is;
+import static org.junit.Assert.assertThat;
+import static org.junit.Assert.fail;
+import javax.jcr.Node;
+import javax.jcr.NodeIterator;
+import javax.jcr.PathNotFoundException;
+import javax.jcr.Property;
+import javax.jcr.PropertyIterator;
+import javax.jcr.RepositoryException;
+import javax.jcr.Session;
+import javax.jcr.Value;
+import javax.jcr.ValueFormatException;
+import javax.jcr.nodetype.NodeType;
+import org.jboss.dna.graph.SecurityContext;
+import org.jboss.dna.jcr.JcrEngine;
+import org.jboss.dna.jcr.JcrTools;
+import org.jboss.dna.repository.sequencer.SequencingService;
+
+/**
+ *
+ */
+public class DdlIntegrationTestUtil {
+ public JcrEngine engine;
+ public Session session;
+ public JcrTools tools;
+ public static final String ddlTestResourceRootFolder = "org/jboss/dna/test/integration/sequencer/ddl/";
+
+
+ public void uploadFile(URL url) throws RepositoryException, IOException {
+ // Grab the last segment of the URL path, using it as the filename
+ String filename = url.getPath().replaceAll("([^/]*/)*", "");
+ String nodePath = "/a/b/" + filename;
+ String mimeType = "ddl";
+
+ // Now use the JCR API to upload the file ...
+
+
+ // Create the node at the supplied path ...
+ Node node = tools.findOrCreateNode(session.getRootNode(), nodePath, "nt:folder", "nt:file");
+
+ // Upload the file to that node ...
+ Node contentNode = tools.findOrCreateChild(node, "jcr:content", "nt:resource");
+ contentNode.setProperty("jcr:mimeType", mimeType);
+ contentNode.setProperty("jcr:lastModified", Calendar.getInstance());
+ contentNode.setProperty("jcr:data", url.openStream());
+
+ // Save the session ...
+ session.save();
+
+ }
+
+ /**
+ * Get the sequencing statistics.
+ *
+ * @return the statistics; never null
+ */
+ public SequencingService.Statistics getStatistics() {
+ return this.engine.getSequencingService().getStatistics();
+ }
+
+ public void waitUntilSequencedNodesIs( int totalNumberOfNodesSequenced ) throws InterruptedException {
+ // check 50 times, waiting 1 second between (for a total of 50 seconds max) ...
+ long numFound = 0;
+ for (int i = 0; i != 50; i++) {
+ numFound = getStatistics().getNumberOfNodesSequenced();
+ if (numFound >= totalNumberOfNodesSequenced) {
+ return;
+ }
+ Thread.sleep(1000);
+ }
+ fail("Expected to find " + totalNumberOfNodesSequenced + " nodes sequenced, but found " + numFound);
+ }
+
+
+ public class MyCustomSecurityContext implements SecurityContext {
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.SecurityContext#getUserName()
+ */
+ public String getUserName() {
+ return "Fred";
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.SecurityContext#hasRole(java.lang.String)
+ */
+ public boolean hasRole( String roleName ) {
+ return true;
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.SecurityContext#logout()
+ */
+ public void logout() {
+ // do something
+ }
+ }
+
+ public void verifyChildNode(Node parentNode, String childNodeName, String propName, String expectedValue) throws Exception {
+ // Find child node
+ Node childNode = null;
+ for (NodeIterator iter = parentNode.getNodes(); iter.hasNext();) {
+ Node nextNode = iter.nextNode();
+ if( nextNode.getName().equals(childNodeName)) {
+ childNode = nextNode;
+ break;
+ }
+ }
+ if( childNode != null ) {
+ assertThat( childNode.hasProperty(propName), is(true));
+ verifySingleValueProperty(childNode, propName, expectedValue);
+
+ } else {
+ fail("NODE: " + childNodeName + " not found");
+ }
+
+ }
+
+ public void verifyNode(Node topNode, String name, String propName) throws Exception {
+ Node node = findNode(topNode, name);
+
+ if( node != null ) {
+ assertThat( node.hasProperty(propName), is(true));
+ } else {
+ fail("NODE: " + name + " not found");
+ }
+
+ }
+
+ public void verifySimpleStringProperty(Node node, String propName, String expectedValue) throws Exception {
+ assertThat( node.hasProperty(propName), is(true));
+ verifySingleValueProperty(node, propName, expectedValue);
+ }
+
+ public void verifyNode(Node topNode, String name, String propName, String expectedValue) throws Exception {
+ Node node = findNode(topNode, name);
+
+ if( node != null ) {
+ assertThat( node.hasProperty(propName), is(true));
+ verifySingleValueProperty(node, propName, expectedValue);
+
+ } else {
+ fail("NODE: " + name + " not found");
+ }
+
+ }
+
+ public void verifyNode(Node topNode, String name, String propName, int expectedValue) throws Exception {
+ Node node = findNode(topNode, name);
+
+ if( node != null ) {
+ assertThat( node.hasProperty(propName), is(true));
+ verifySingleValueProperty(node, propName, expectedValue);
+
+ } else {
+ fail("NODE: " + name + " not found");
+ }
+
+ }
+
+ protected Value value( String value ) throws Exception {
+ return session.getValueFactory().createValue(value);
+ }
+
+ public void verifySingleValueProperty(Node node, String propNameStr, String expectedValue) throws Exception {
+ Value expValue = value(expectedValue);
+ Property prop = node.getProperty(propNameStr);
+ if( prop.getDefinition().isMultiple()) {
+ boolean hasValue = false;
+
+ Object[] values = prop.getValues();
+ for( Object val : values) {
+ if(val.equals(expValue)) {
+ hasValue = true;
+ }
+ }
+
+ assertThat(hasValue, is(true));
+ } else {
+ Object actualValue = prop.getValue();
+ assertThat(expValue, is(actualValue));
+ }
+
+ }
+
+
+ public void verifySingleValueProperty(Node node, String propNameStr, int expectedValue) throws Exception {
+ Property prop = node.getProperty(propNameStr);
+ Value expValue = session.getValueFactory().createValue(expectedValue);
+ Object actualValue = prop.getValue();
+ assertThat(expValue, is(actualValue));
+
+ }
+
+ public void verifyMixin(Node topNode, String nodeName, String nodeType) throws Exception {
+ Node node = findNode(topNode, nodeName);
+
+ if( node != null ) {
+ verifyMixin(node, nodeType);
+
+ } else {
+ fail("NODE: " + nodeName + " not found");
+ }
+ }
+
+ public boolean hasMixin(Node node, String nodeType) throws Exception {
+ for( NodeType mixin : node.getMixinNodeTypes() ) {
+ String mixinName = mixin.getName();
+ if( mixinName.equals(nodeType) ) {
+ return true;
+ }
+ }
+ return false;
+ }
+
+ public void verifyMixin(Node node, String nodeType) throws Exception {
+ boolean foundMixin = hasMixin(node, nodeType);
+
+
+ assertThat(foundMixin, is(true));
+ }
+
+ public void verifyNodeType(Node topNode, String nodeName, String nodeTypeName) throws Exception {
+ Node node = findNode(topNode, nodeName);
+
+ if( node != null ) {
+ assertThat(node.isNodeType(nodeTypeName), is(true));
+ } else {
+ fail("NODE: " + nodeName + " not found");
+ }
+
+ }
+
+ public void verifyNodeTypes(Node topNode, String nodeName, String nodeTypeName, String...moreNodeTypeNames) throws Exception {
+ Node node = findNode(topNode, nodeName);
+
+ if( node != null ) {
+ assertThat(node.isNodeType(nodeTypeName), is(true));
+ for( String nextTypeName : moreNodeTypeNames ) {
+ assertThat(node.isNodeType(nextTypeName), is(true));
+ }
+ } else {
+ fail("NODE: " + nodeName + " not found");
+ }
+
+ }
+
+ public Node findNode(Node node, String name) throws Exception {
+ if( node.getName().equals(name)) {
+ return node;
+ }
+ for (NodeIterator iter = node.getNodes(); iter.hasNext();) {
+ Node nextNode = iter.nextNode();
+ if( nextNode.getName().equals(name)) {
+ return nextNode;
+ }
+ Node someNode = findNode(nextNode, name);
+ if( someNode != null ) {
+ return someNode;
+ }
+ }
+
+ return null;
+ }
+
+ public Node findNode(Node node, String name, String type) throws Exception {
+ if( node.getName().equals(name) && node.isNodeType(type)) { //(hasMixin(node, type) || node.isNodeType(type))) {
+ return node;
+ }
+ for (NodeIterator iter = node.getNodes(); iter.hasNext();) {
+ Node nextNode = iter.nextNode();
+ //String nextNodeName = nextNode.getName();
+ //boolean isNodeType = nextNode.isNodeType(type);
+ if( nextNode.getName().equals(name) && nextNode.isNodeType(type)) { //nextNodeName.equals(name) && isNodeType) { //(hasMixin(node, type) || node.isNodeType(type))) {
+ return nextNode;
+ }
+ Node someNode = findNode(nextNode, name, type);
+ if( someNode != null ) {
+ return someNode;
+ }
+ }
+
+ return null;
+ }
+
+ public void printProperties( Node node ) throws RepositoryException, PathNotFoundException, ValueFormatException {
+
+ System.out.println("\n >>> NODE PATH: " + node.getPath() );
+ System.out.println(" NAME: " + node.getName() + "\n" );
+
+ // Create a Properties object containing the properties for this node; ignore any children ...
+ //Properties props = new Properties();
+ for (PropertyIterator propertyIter = node.getProperties(); propertyIter.hasNext();) {
+ Property property = propertyIter.nextProperty();
+ String name = property.getName();
+ String stringValue = null;
+ if (property.getDefinition().isMultiple()) {
+ StringBuilder sb = new StringBuilder();
+ boolean first = true;
+ for (Value value : property.getValues()) {
+ if (!first) {
+ sb.append(", ");
+ }
+ first = false;
+ sb.append(value.getString());
+ }
+ stringValue = sb.toString();
+ } else {
+ stringValue = property.getValue().getString();
+ }
+ System.out.println(" | PROP: " + name + " VALUE: " + stringValue);
+ //props.put(name, stringValue);
+ }
+ }
+}
Property changes on: trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/DdlIntegrationTestUtil.java
___________________________________________________________________
Name: svn:mime-type
+ text/plain
Modified: trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/DdlSequencerIntegrationTest.java
===================================================================
--- trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/DdlSequencerIntegrationTest.java 2010-01-05 19:16:20 UTC (rev 1529)
+++ trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/DdlSequencerIntegrationTest.java 2010-01-05 20:33:13 UTC (rev 1530)
@@ -23,35 +23,16 @@
*/
package org.jboss.dna.test.integration.sequencer.ddl;
-import static org.junit.Assert.assertNotNull;
-import static org.hamcrest.core.Is.is;
import static org.junit.Assert.assertEquals;
-import static org.junit.Assert.assertThat;
-import static org.junit.Assert.fail;
-import java.io.IOException;
+import static org.junit.Assert.assertNotNull;
import java.net.URL;
-import java.util.Calendar;
import javax.jcr.Node;
import javax.jcr.NodeIterator;
-import javax.jcr.PathNotFoundException;
-import javax.jcr.Property;
-import javax.jcr.PropertyIterator;
-import javax.jcr.RepositoryException;
-import javax.jcr.Session;
-import javax.jcr.Value;
-import javax.jcr.ValueFormatException;
-import javax.jcr.nodetype.NodeType;
-import org.jboss.dna.graph.SecurityContext;
import org.jboss.dna.graph.connector.inmemory.InMemoryRepositorySource;
import org.jboss.dna.jcr.JcrConfiguration;
-import org.jboss.dna.jcr.JcrEngine;
import org.jboss.dna.jcr.JcrTools;
import org.jboss.dna.jcr.SecurityContextCredentials;
-import org.jboss.dna.repository.sequencer.SequencingService;
import org.jboss.dna.sequencer.ddl.StandardDdlLexicon;
-import org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon;
-import org.jboss.dna.sequencer.ddl.dialect.oracle.OracleDdlLexicon;
-import org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
@@ -59,11 +40,11 @@
/**
*
*/
-public class DdlSequencerIntegrationTest {
- private JcrEngine engine;
- private Session session;
- private JcrTools tools;
- private static final String cndDdlFolder = "org/jboss/dna/test/integration/sequencer/ddl/";
+public class DdlSequencerIntegrationTest extends DdlIntegrationTestUtil {
+// private JcrEngine engine;
+// private Session session;
+// private JcrTools tools;
+// private static final String cndDdlFolder = "org/jboss/dna/test/integration/sequencer/ddl/";
@Before
public void beforeEach() throws Exception {
@@ -85,14 +66,8 @@
.setProperty("defaultWorkspaceName", workspaceName);
// Set up the JCR repository to use the source ...
config.repository(repositoryName)
- .addNodeTypes(getUrl(cndDdlFolder + "StandardDdl.cnd"))
- .addNodeTypes(getUrl(cndDdlFolder + "DerbyDdl.cnd"))
- .addNodeTypes(getUrl(cndDdlFolder + "OracleDdl.cnd"))
- .addNodeTypes(getUrl(cndDdlFolder + "PostgresDdl.cnd"))
+ .addNodeTypes(getUrl(ddlTestResourceRootFolder + "StandardDdl.cnd"))
.registerNamespace(StandardDdlLexicon.Namespace.PREFIX, StandardDdlLexicon.Namespace.URI)
- .registerNamespace(DerbyDdlLexicon.Namespace.PREFIX, DerbyDdlLexicon.Namespace.URI)
- .registerNamespace(OracleDdlLexicon.Namespace.PREFIX, OracleDdlLexicon.Namespace.URI)
- .registerNamespace(PostgresDdlLexicon.Namespace.PREFIX, PostgresDdlLexicon.Namespace.URI)
.setSource(repositorySource);
// Set up the DDL sequencer ...
config.sequencer("DDL Sequencer")
@@ -124,55 +99,10 @@
}
}
- private void uploadFile(URL url) throws RepositoryException, IOException {
- // Grab the last segment of the URL path, using it as the filename
- String filename = url.getPath().replaceAll("([^/]*/)*", "");
- String nodePath = "/a/b/" + filename;
- String mimeType = "ddl";
-
- // Now use the JCR API to upload the file ...
-
-
- // Create the node at the supplied path ...
- Node node = tools.findOrCreateNode(session.getRootNode(), nodePath, "nt:folder", "nt:file");
-
- // Upload the file to that node ...
- Node contentNode = tools.findOrCreateChild(node, "jcr:content", "nt:resource");
- contentNode.setProperty("jcr:mimeType", mimeType);
- contentNode.setProperty("jcr:lastModified", Calendar.getInstance());
- contentNode.setProperty("jcr:data", url.openStream());
-
- // Save the session ...
- session.save();
-
- }
-
- /**
- * Get the sequencing statistics.
- *
- * @return the statistics; never null
- */
- public SequencingService.Statistics getStatistics() {
- return this.engine.getSequencingService().getStatistics();
- }
-
- protected void waitUntilSequencedNodesIs( int totalNumberOfNodesSequenced ) throws InterruptedException {
- // check 50 times, waiting 0.1 seconds between (for a total of 5 seconds max) ...
- long numFound = 0;
- for (int i = 0; i != 50; i++) {
- numFound = getStatistics().getNumberOfNodesSequenced();
- if (numFound >= totalNumberOfNodesSequenced) {
- return;
- }
- Thread.sleep(1000);
- }
- fail("Expected to find " + totalNumberOfNodesSequenced + " nodes sequenced, but found " + numFound);
- }
-
@Test
public void shouldSequenceCreateSchemaDdlFile() throws Exception {
System.out.println("STARTED: shouldSequenceCreateSchemaDdlFile(create_schema.ddl)");
- URL url = getUrl(cndDdlFolder + "create_schema.ddl");
+ URL url = getUrl(ddlTestResourceRootFolder + "create_schema.ddl");
uploadFile(url);
waitUntilSequencedNodesIs(1);
@@ -202,7 +132,7 @@
@Test
public void shouldSequenceStandardDdlFile() throws Exception {
System.out.println("STARTED: shouldSequenceStandardDdlFile(standard_test_statements.ddl)");
- URL url = getUrl(cndDdlFolder + "standard_test_statements.ddl");
+ URL url = getUrl(ddlTestResourceRootFolder + "standard_test_statements.ddl");
uploadFile(url);
waitUntilSequencedNodesIs(1);
@@ -220,36 +150,18 @@
//printNodeProperties(ddlNode);
long numStatements = ddlNode.getNodes().nextNode().getNodes().getSize();
- assertEquals(numStatements, 11);
-
- //GRANT SELECT ON TABLE purchaseOrders TO maria,harry;
- Node grantNode = findNode(ddlNode, "purchaseOrders", "ddl:grantOnTableStatement");
- assertNotNull(grantNode);
- Node granteeNode = findNode(grantNode, "maria", "ddl:grantee");
- assertNotNull(granteeNode);
- Node privNode = findNode(grantNode, "privilege", "ddl:grantPrivilege");
- assertNotNull(privNode);
- verifySingleValueProperty(privNode, "ddl:type", "SELECT");
-
- //GRANT UPDATE, USAGE ON TABLE purchaseOrders TO anita,zhi;
- grantNode = findNode(ddlNode, "billedOrders", "ddl:grantOnTableStatement");
- assertNotNull(grantNode);
- privNode = findNode(grantNode, "privilege", "ddl:grantPrivilege");
- assertNotNull(privNode);
- verifySingleValueProperty(privNode, "ddl:type", "UPDATE");
- granteeNode = findNode(grantNode, "anita", "ddl:grantee");
- assertNotNull(granteeNode);
+ assertEquals(numStatements, 7);
}
}
}
- System.out.println("FINISHED: shouldSequenceStandardDdlFile(create_schema.ddl)");
+ System.out.println("FINISHED: shouldSequenceStandardDdlFile(standard_test_statements.ddl)");
}
@Test
- public void shouldSequenceDerbyDdlFile() throws Exception {
- System.out.println("STARTED: shouldSequenceDerbyDdlFile(derby_test_statements.ddl)");
- URL url = getUrl(cndDdlFolder + "derby_test_statements.ddl");
+ public void shouldSequenceStandardDdlGrantStatements() throws Exception {
+ System.out.println("STARTED: shouldSequenceStandardDdlGrantStatements(grant_test_statements.ddl)");
+ URL url = getUrl(ddlTestResourceRootFolder + "grant_test_statements.ddl");
uploadFile(url);
waitUntilSequencedNodesIs(1);
@@ -263,110 +175,40 @@
//System.out.println(" | NAME: " + ddlsNode.getName() + " PATH: " + ddlsNode.getPath());
for (NodeIterator iter = ddlsNode.getNodes(); iter.hasNext();) {
Node ddlNode = iter.nextNode();
+
+ //printNodeProperties(ddlNode);
long numStatements = ddlNode.getNodes().nextNode().getNodes().getSize();
- assertEquals(numStatements, 64);
+ assertEquals(numStatements, 4);
- //printNodeProperties(ddlNode);
-
- verifyNode(ddlNode, "HOTELAVAILABILITY", "ddl:startLineNumber");
- verifyNode(ddlNode, "SAMP.DEPARTMENT", "ddl:expression");
- verifyNode(ddlNode, "HOTEL_ID", "ddl:datatypeName");
- verifyNode(ddlNode, "CITIES", "ddl:startLineNumber");
-
- // Create Function
- verifyNode(ddlNode, "PROPERTY_FILE_READER", "ddl:startLineNumber", 71);
- verifyNodeTypes(ddlNode, "PROPERTY_FILE_READER",
- "derbyddl:createFunctionStatement",
- "ddl:creatable",
- "derbyddl:functionOperand");
- verifyNode(ddlNode, "KEY_COL", "ddl:datatypeName", "VARCHAR");
-
- Node functionNode = findNode(ddlNode, "TO_DEGREES");
- assertNotNull(functionNode);
- verifyChildNode(functionNode, "parameterStyle", "ddl:value", "PARAMETER STYLE JAVA");
-
- // Create Index
- // CREATE INDEX IXSALE ON SAMP.SALES (SALES);
- Node indexNode = findNode(ddlNode, "IXSALE", "derbyddl:createIndexStatement");
- assertNotNull(indexNode);
- verifySimpleStringProperty(indexNode, "derbyddl:tableName", "SAMP.SALES");
- Node colRefNode = findNode(indexNode, "SALES");
- assertNotNull(colRefNode);
- colRefNode = findNode(ddlNode, "SALES", "derbyddl:indexColumnReference");
- assertNotNull(colRefNode);
- verifyNodeTypes(colRefNode, "SALES",
- "derbyddl:indexColumnReference",
- "ddl:columnReference",
- "ddl:referenceOperand");
-
- // declare global temporary table SESSION.t1(c11 int) not logged;
- Node ttNode = findNode(ddlNode, "SESSION.t1", "derbyddl:declareGlobalTemporaryTableStatement");
- assertNotNull(ttNode);
- Node colNode = findNode(ttNode, "c11");
- assertNotNull(colNode);
- verifySimpleStringProperty(colNode, "ddl:datatypeName", "int");
-
- // LOCK TABLE FlightAvailability IN EXCLUSIVE MODE;
- Node lockNode = findNode(ddlNode, "FlightAvailability", "derbyddl:lockTableStatement");
- assertNotNull(lockNode);
- Node optionNode = findNode(lockNode, "lockMode");
- assertNotNull(optionNode);
- verifySimpleStringProperty(optionNode, "ddl:value", "EXCLUSIVE");
-
- // RENAME TABLE SAMP.EMP_ACT TO EMPLOYEE_ACT
- Node renameTableNode = findNode(ddlNode, "SAMP.EMP_ACT", "derbyddl:renameTableStatement");
- assertNotNull(renameTableNode);
- verifySimpleStringProperty(renameTableNode, "ddl:newName", "EMPLOYEE_ACT");
-
- // CREATE SYNONYM SAMP.T1 FOR SAMP.TABLEWITHLONGNAME;
- Node synonymNode = findNode(ddlNode, "SAMP.T1", "derbyddl:createSynonymStatement");
- assertNotNull(synonymNode);
- verifySimpleStringProperty(synonymNode, "derbyddl:tableName", "SAMP.TABLEWITHLONGNAME");
-
- //CREATE TRIGGER FLIGHTSDELETE3
- // AFTER DELETE ON FLIGHTS
- // REFERENCING OLD AS OLD
- // FOR EACH ROW
- // DELETE FROM FLIGHTAVAILABILITY WHERE FLIGHT_ID = OLD.FLIGHT_ID;
- Node triggerNode = findNode(ddlNode, "FLIGHTSDELETE3", "derbyddl:createTriggerStatement");
- assertNotNull(triggerNode);
- verifySimpleStringProperty(triggerNode, "derbyddl:tableName", "FLIGHTS");
-
- //CREATE TRIGGER t1 NO CASCADE BEFORE UPDATE ON x
- // FOR EACH ROW MODE DB2SQL
- // values app.notifyEmail('Jerry', 'Table x is about to be updated');
- triggerNode = findNode(ddlNode, "t1", "derbyddl:createTriggerStatement");
- assertNotNull(triggerNode);
- verifySimpleStringProperty(triggerNode, "derbyddl:tableName", "x");
- optionNode = findNode(triggerNode, "forEach");
- assertNotNull(optionNode);
- verifySimpleStringProperty(optionNode, "ddl:value", "FOR EACH ROW");
- optionNode = findNode(triggerNode, "eventType");
- assertNotNull(optionNode);
- verifySimpleStringProperty(optionNode, "ddl:value", "UPDATE");
-
- //GRANT EXECUTE ON PROCEDURE p TO george;
- Node grantNode = findNode(ddlNode, "p", "derbyddl:grantOnProcedureStatement");
+ //GRANT SELECT ON TABLE purchaseOrders TO maria,harry;
+ Node grantNode = findNode(ddlNode, "purchaseOrders", "ddl:grantOnTableStatement");
assertNotNull(grantNode);
+ Node granteeNode = findNode(grantNode, "maria", "ddl:grantee");
+ assertNotNull(granteeNode);
+ Node privNode = findNode(grantNode, "privilege", "ddl:grantPrivilege");
+ assertNotNull(privNode);
+ verifySingleValueProperty(privNode, "ddl:type", "SELECT");
- //GRANT purchases_reader_role TO george,maria;
- grantNode = findNode(ddlNode, "grantRoles", "derbyddl:grantRolesStatement");
+ //GRANT UPDATE, USAGE ON TABLE billedOrders TO anita,zhi;
+ grantNode = findNode(ddlNode, "billedOrders", "ddl:grantOnTableStatement");
assertNotNull(grantNode);
- Node roleNode = findNode(grantNode, "george", "ddl:grantee");
- assertNotNull(roleNode);
-
+ privNode = findNode(grantNode, "privilege", "ddl:grantPrivilege");
+ assertNotNull(privNode);
+ verifySingleValueProperty(privNode, "ddl:type", "UPDATE");
+ granteeNode = findNode(grantNode, "anita", "ddl:grantee");
+ assertNotNull(granteeNode);
}
}
}
- System.out.println("FINISHED: shouldSequenceDerbyDdlFile(derby_test_statements.ddl)");
+ System.out.println("FINISHED: shouldSequenceStandardDdlGrantStatements(grant_test_statements.ddl)");
}
@Test
- public void shouldSequenceOracleDdlFile() throws Exception {
- System.out.println("STARTED: shouldSequenceOracleDdlFile(oracle_test_statements.ddl)");
- URL url = getUrl(cndDdlFolder + "oracle_test_statements.ddl");
+ public void shouldSequenceStandardDdlRevokeStatements() throws Exception {
+ System.out.println("STARTED: shouldSequenceStandardDdlRevokeStatements(revoke_test_statements.ddl)");
+ URL url = getUrl(ddlTestResourceRootFolder + "revoke_test_statements.ddl");
uploadFile(url);
waitUntilSequencedNodesIs(1);
@@ -380,335 +222,34 @@
//System.out.println(" | NAME: " + ddlsNode.getName() + " PATH: " + ddlsNode.getPath());
for (NodeIterator iter = ddlsNode.getNodes(); iter.hasNext();) {
Node ddlNode = iter.nextNode();
-
- long numStatements = ddlNode.getNodes().nextNode().getNodes().getSize();
- assertEquals(numStatements, 50);
-
+
//printNodeProperties(ddlNode);
- verifyNode(ddlNode, "address", "ddl:startLineNumber");
- verifyNode(ddlNode, "cust_orders", "ddl:expression");
- verifyMixin(ddlNode, "cust_orders", "oracleddl:createIndexStatement");
- verifyNodeType(ddlNode, "cust_orders", "oracleddl:createIndexStatement");
- verifyNodeType(ddlNode, "cust_orders", "ddl:creatable");
- verifyNode(ddlNode, "cust_orders", "ddl:startCharIndex", 1698);
- verifyNode(ddlNode, "customers_dim", "ddl:startColumnNumber");
- }
- }
- }
-
- System.out.println("FINISHED: shouldSequenceOracleDdlFile(oracle_test_statements.ddl)");
- }
-
- @Test
- public void shouldSequencePostgresDdlFile() throws Exception {
- System.out.println("STARTED: shouldSequencePostgresDdlFile(postgres_test_statements.ddl)");
- URL url = getUrl(cndDdlFolder + "postgres_test_statements.ddl");
- uploadFile(url);
-
- waitUntilSequencedNodesIs(1);
-
- // Find the node ...
- Node root = session.getRootNode();
-
- if (root.hasNode("ddls") ) {
- if (root.hasNode("ddls")) {
- Node ddlsNode = root.getNode("ddls");
- //System.out.println(" | NAME: " + ddlsNode.getName() + " PATH: " + ddlsNode.getPath());
- for (NodeIterator iter = ddlsNode.getNodes(); iter.hasNext();) {
- Node ddlNode = iter.nextNode();
-
long numStatements = ddlNode.getNodes().nextNode().getNodes().getSize();
- assertEquals(numStatements, 106);
+ assertEquals(4, numStatements);
- //printNodeProperties(ddlNode);
+ //REVOKE SELECT ON TABLE purchaseOrders FROM maria,harry;
+ Node revokeNode = findNode(ddlNode, "purchaseOrders", "ddl:revokeOnTableStatement");
+ assertNotNull(revokeNode);
+ Node granteeNode = findNode(revokeNode, "maria", "ddl:grantee");
+ assertNotNull(granteeNode);
+ Node privNode = findNode(revokeNode, "privilege", "ddl:grantPrivilege");
+ assertNotNull(privNode);
+ verifySingleValueProperty(privNode, "ddl:type", "SELECT");
- verifyNodeType(ddlNode, "increment", "postgresddl:createFunctionStatement");
- verifyNode(ddlNode, "increment", "ddl:expression");
- verifyNodeType(ddlNode, "increment", "ddl:creatable");
- verifyNodeType(ddlNode, "increment", "postgresddl:functionOperand");
- verifyNode(ddlNode, "increment", "ddl:startLineNumber", 214);
- verifyNode(ddlNode, "increment", "ddl:startCharIndex", 7604);
-
-
- //COMMENT ON FUNCTION my_function (timestamp) IS ’Returns Roman Numeral’;
- verifyNodeType(ddlNode, "my_function", "postgresddl:commentOnStatement");
- verifyNode(ddlNode, "my_function", "ddl:expression");
- verifyNodeType(ddlNode, "my_function", "postgresddl:commentOperand");
- verifyNode(ddlNode, "my_function", "ddl:startLineNumber", 44);
- verifyNode(ddlNode, "my_function", "ddl:startCharIndex", 1573);
- verifyNode(ddlNode, "my_function", "postgresddl:comment", "'Returns Roman Numeral'");
-
- //ALTER TABLE foreign_companies RENAME COLUMN address TO city;
- Node alterTableNode = findNode(ddlNode, "foreign_companies", "postgresddl:alterTableStatement");
- assertNotNull(alterTableNode);
- Node renameColNode = findNode(alterTableNode, "address","postgresddl:renamedColumn");
- assertNotNull(renameColNode);
- verifySingleValueProperty(renameColNode, "ddl:newName", "city");
-
- //GRANT EXECUTE ON FUNCTION divideByTwo(numerator int, IN demoninator int) TO george;
- Node grantNode = findNode(ddlNode, "divideByTwo", "postgresddl:grantOnFunctionStatement");
- assertNotNull(grantNode);
- Node parameter_1 = findNode(grantNode, "numerator","postgresddl:functionParameter");
- assertNotNull(parameter_1);
- verifySingleValueProperty(parameter_1, "ddl:datatypeName", "int");
+ //REVOKE UPDATE, USAGE ON TABLE orderDetails FROM anita,zhi CASCADE;
+ revokeNode = findNode(ddlNode, "orderDetails", "ddl:revokeOnTableStatement");
+ assertNotNull(revokeNode);
+ privNode = findNode(revokeNode, "privilege", "ddl:grantPrivilege");
+ assertNotNull(privNode);
+ verifySingleValueProperty(privNode, "ddl:type", "UPDATE");
+ granteeNode = findNode(revokeNode, "anita", "ddl:grantee");
+ assertNotNull(granteeNode);
}
}
}
- System.out.println("FINISHED: shouldSequencePostgresDdlFile(postgres_test_statements.ddl)");
+ System.out.println("FINISHED: shouldSequenceStandardDdlRevokeStatements(revoke_test_statements.ddl)");
}
-
- protected class MyCustomSecurityContext implements SecurityContext {
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.SecurityContext#getUserName()
- */
- public String getUserName() {
- return "Fred";
- }
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.SecurityContext#hasRole(java.lang.String)
- */
- public boolean hasRole( String roleName ) {
- return true;
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.SecurityContext#logout()
- */
- public void logout() {
- // do something
- }
- }
-
- public void printNodeProperties(Node node) throws Exception {
- printProperties(node);
-
- for (NodeIterator iter = node.getNodes(); iter.hasNext();) {
- printNodeProperties(iter.nextNode());
- }
-
- }
-
- private void verifyChildNode(Node parentNode, String childNodeName, String propName, String expectedValue) throws Exception {
- // Find child node
- Node childNode = null;
- for (NodeIterator iter = parentNode.getNodes(); iter.hasNext();) {
- Node nextNode = iter.nextNode();
- if( nextNode.getName().equals(childNodeName)) {
- childNode = nextNode;
- break;
- }
- }
- if( childNode != null ) {
- assertThat( childNode.hasProperty(propName), is(true));
- verifySingleValueProperty(childNode, propName, expectedValue);
-
- } else {
- fail("NODE: " + childNodeName + " not found");
- }
-
- }
-
- private void verifyNode(Node topNode, String name, String propName) throws Exception {
- Node node = findNode(topNode, name);
-
- if( node != null ) {
- assertThat( node.hasProperty(propName), is(true));
- } else {
- fail("NODE: " + name + " not found");
- }
-
- }
-
- private void verifySimpleStringProperty(Node node, String propName, String expectedValue) throws Exception {
- assertThat( node.hasProperty(propName), is(true));
- verifySingleValueProperty(node, propName, expectedValue);
- }
-
- private void verifyNode(Node topNode, String name, String propName, String expectedValue) throws Exception {
- Node node = findNode(topNode, name);
-
- if( node != null ) {
- assertThat( node.hasProperty(propName), is(true));
- verifySingleValueProperty(node, propName, expectedValue);
-
- } else {
- fail("NODE: " + name + " not found");
- }
-
- }
-
- private void verifyNode(Node topNode, String name, String propName, int expectedValue) throws Exception {
- Node node = findNode(topNode, name);
-
- if( node != null ) {
- assertThat( node.hasProperty(propName), is(true));
- verifySingleValueProperty(node, propName, expectedValue);
-
- } else {
- fail("NODE: " + name + " not found");
- }
-
- }
-
- protected Value value( String value ) throws Exception {
- return session.getValueFactory().createValue(value);
- }
-
- private void verifySingleValueProperty(Node node, String propNameStr, String expectedValue) throws Exception {
- Value expValue = value(expectedValue);
- Property prop = node.getProperty(propNameStr);
- if( prop.getDefinition().isMultiple()) {
- boolean hasValue = false;
-
- Object[] values = prop.getValues();
- for( Object val : values) {
- if(val.equals(expValue)) {
- hasValue = true;
- }
- }
-
- assertThat(hasValue, is(true));
- } else {
- Object actualValue = prop.getValue();
- assertThat(expValue, is(actualValue));
- }
-
- }
-
-
- private void verifySingleValueProperty(Node node, String propNameStr, int expectedValue) throws Exception {
- Property prop = node.getProperty(propNameStr);
- Value expValue = session.getValueFactory().createValue(expectedValue);
- Object actualValue = prop.getValue();
- assertThat(expValue, is(actualValue));
-
- }
-
- private void verifyMixin(Node topNode, String nodeName, String nodeType) throws Exception {
- Node node = findNode(topNode, nodeName);
-
- if( node != null ) {
- verifyMixin(node, nodeType);
-
- } else {
- fail("NODE: " + nodeName + " not found");
- }
- }
-
- private boolean hasMixin(Node node, String nodeType) throws Exception {
- for( NodeType mixin : node.getMixinNodeTypes() ) {
- String mixinName = mixin.getName();
- if( mixinName.equals(nodeType) ) {
- return true;
- }
- }
- return false;
- }
-
- private void verifyMixin(Node node, String nodeType) throws Exception {
- boolean foundMixin = hasMixin(node, nodeType);
-
-
- assertThat(foundMixin, is(true));
- }
-
- private void verifyNodeType(Node topNode, String nodeName, String nodeTypeName) throws Exception {
- Node node = findNode(topNode, nodeName);
-
- if( node != null ) {
- assertThat(node.isNodeType(nodeTypeName), is(true));
- } else {
- fail("NODE: " + nodeName + " not found");
- }
-
- }
-
- private void verifyNodeTypes(Node topNode, String nodeName, String nodeTypeName, String...moreNodeTypeNames) throws Exception {
- Node node = findNode(topNode, nodeName);
-
- if( node != null ) {
- assertThat(node.isNodeType(nodeTypeName), is(true));
- for( String nextTypeName : moreNodeTypeNames ) {
- assertThat(node.isNodeType(nextTypeName), is(true));
- }
- } else {
- fail("NODE: " + nodeName + " not found");
- }
-
- }
-
- private Node findNode(Node node, String name) throws Exception {
- if( node.getName().equals(name)) {
- return node;
- }
- for (NodeIterator iter = node.getNodes(); iter.hasNext();) {
- Node nextNode = iter.nextNode();
- if( nextNode.getName().equals(name)) {
- return nextNode;
- }
- Node someNode = findNode(nextNode, name);
- if( someNode != null ) {
- return someNode;
- }
- }
-
- return null;
- }
-
- private Node findNode(Node node, String name, String type) throws Exception {
- if( node.getName().equals(name) && node.isNodeType(type)) { //(hasMixin(node, type) || node.isNodeType(type))) {
- return node;
- }
- for (NodeIterator iter = node.getNodes(); iter.hasNext();) {
- Node nextNode = iter.nextNode();
- if( nextNode.getName().equals(name) && node.isNodeType(type)) { //(hasMixin(node, type) || node.isNodeType(type))) {
- return nextNode;
- }
- Node someNode = findNode(nextNode, name, type);
- if( someNode != null ) {
- return someNode;
- }
- }
-
- return null;
- }
-
- private void printProperties( Node node ) throws RepositoryException, PathNotFoundException, ValueFormatException {
-
- System.out.println("\n >>> NODE PATH: " + node.getPath() );
- System.out.println(" NAME: " + node.getName() + "\n" );
-
- // Create a Properties object containing the properties for this node; ignore any children ...
- //Properties props = new Properties();
- for (PropertyIterator propertyIter = node.getProperties(); propertyIter.hasNext();) {
- Property property = propertyIter.nextProperty();
- String name = property.getName();
- String stringValue = null;
- if (property.getDefinition().isMultiple()) {
- StringBuilder sb = new StringBuilder();
- boolean first = true;
- for (Value value : property.getValues()) {
- if (!first) {
- sb.append(", ");
- first = false;
- }
- sb.append(value.getString());
- }
- stringValue = sb.toString();
- } else {
- stringValue = property.getValue().getString();
- }
- System.out.println(" | PROP: " + name + " VALUE: " + stringValue);
- //props.put(name, stringValue);
- }
- }
}
Added: trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/DerbyDdlSequencerIntegrationTest.java
===================================================================
--- trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/DerbyDdlSequencerIntegrationTest.java (rev 0)
+++ trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/DerbyDdlSequencerIntegrationTest.java 2010-01-05 20:33:13 UTC (rev 1530)
@@ -0,0 +1,220 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.test.integration.sequencer.ddl.dialect.derby;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNotNull;
+import java.net.URL;
+import javax.jcr.Node;
+import javax.jcr.NodeIterator;
+import org.jboss.dna.graph.connector.inmemory.InMemoryRepositorySource;
+import org.jboss.dna.jcr.JcrConfiguration;
+import org.jboss.dna.jcr.JcrTools;
+import org.jboss.dna.jcr.SecurityContextCredentials;
+import org.jboss.dna.sequencer.ddl.StandardDdlLexicon;
+import org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon;
+import org.jboss.dna.test.integration.sequencer.ddl.DdlIntegrationTestUtil;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+
+/**
+ * @author blafond
+ *
+ */
+public class DerbyDdlSequencerIntegrationTest extends DdlIntegrationTestUtil {
+ private String resourceFolder = ddlTestResourceRootFolder + "/dialect/derby/";
+
+ @Before
+ public void beforeEach() throws Exception {
+ // Configure the DNA configuration. This could be done by loading a configuration from a file, or by
+ // using a (local or remote) configuration repository, or by setting up the configuration programmatically.
+ // This test uses the programmatic approach...
+
+ tools = new JcrTools();
+
+ String repositoryName = "ddlRepository";
+ String workspaceName = "default";
+ String repositorySource = "ddlRepositorySource";
+
+ JcrConfiguration config = new JcrConfiguration();
+ // Set up the in-memory source where we'll upload the content and where the sequenced output will be stored ...
+ config.repositorySource(repositorySource)
+ .usingClass(InMemoryRepositorySource.class)
+ .setDescription("The repository for our content")
+ .setProperty("defaultWorkspaceName", workspaceName);
+ // Set up the JCR repository to use the source ...
+ config.repository(repositoryName)
+ .addNodeTypes(getUrl(ddlTestResourceRootFolder + "StandardDdl.cnd"))
+ .addNodeTypes(getUrl(resourceFolder + "DerbyDdl.cnd"))
+ .registerNamespace(StandardDdlLexicon.Namespace.PREFIX, StandardDdlLexicon.Namespace.URI)
+ .registerNamespace(DerbyDdlLexicon.Namespace.PREFIX, DerbyDdlLexicon.Namespace.URI)
+ .setSource(repositorySource);
+ // Set up the DDL sequencer ...
+ config.sequencer("DDL Sequencer")
+ .usingClass("org.jboss.dna.sequencer.ddl.DdlSequencer")
+ .loadedFromClasspath()
+ .setDescription("Sequences DDL files to extract individual statements and accompanying statement properties and values")
+ .sequencingFrom("//(*.(ddl)[*])/jcr:content[@jcr:data]")
+ .andOutputtingTo("/ddls/$1");
+ config.save();
+ this.engine = config.build();
+ this.engine.start();
+
+ this.session = this.engine.getRepository(repositoryName)
+ .login(new SecurityContextCredentials(new MyCustomSecurityContext()), workspaceName);
+
+ }
+
+ private URL getUrl(String urlStr) {
+ return this.getClass().getClassLoader().getResource(urlStr);
+ }
+
+ @After
+ public void afterEach() throws Exception {
+ if (this.session != null) {
+ this.session.logout();
+ }
+ if (this.engine != null) {
+ this.engine.shutdown();
+ }
+ }
+
+ @Test
+ public void shouldSequenceDerbyDdlFile() throws Exception {
+ System.out.println("STARTED: shouldSequenceDerbyDdlFile(derby_test_statements.ddl)");
+ URL url = getUrl(resourceFolder + "derby_test_statements.ddl");
+ uploadFile(url);
+
+ waitUntilSequencedNodesIs(1);
+
+ // Find the node ...
+ Node root = session.getRootNode();
+
+ if (root.hasNode("ddls") ) {
+ if (root.hasNode("ddls")) {
+ Node ddlsNode = root.getNode("ddls");
+ //System.out.println(" | NAME: " + ddlsNode.getName() + " PATH: " + ddlsNode.getPath());
+ for (NodeIterator iter = ddlsNode.getNodes(); iter.hasNext();) {
+ Node ddlNode = iter.nextNode();
+
+ long numStatements = ddlNode.getNodes().nextNode().getNodes().getSize();
+ assertEquals(64, numStatements);
+
+ //printNodeProperties(ddlNode);
+
+ verifyNode(ddlNode, "HOTELAVAILABILITY", "ddl:startLineNumber");
+ verifyNode(ddlNode, "SAMP.DEPARTMENT", "ddl:expression");
+ verifyNode(ddlNode, "HOTEL_ID", "ddl:datatypeName");
+ verifyNode(ddlNode, "CITIES", "ddl:startLineNumber");
+
+ // Create Function
+ verifyNode(ddlNode, "PROPERTY_FILE_READER", "ddl:startLineNumber", 71);
+ verifyNodeTypes(ddlNode, "PROPERTY_FILE_READER",
+ "derbyddl:createFunctionStatement",
+ "ddl:creatable",
+ "derbyddl:functionOperand");
+ verifyNode(ddlNode, "KEY_COL", "ddl:datatypeName", "VARCHAR");
+
+ Node functionNode = findNode(ddlNode, "TO_DEGREES");
+ assertNotNull(functionNode);
+ verifyChildNode(functionNode, "parameterStyle", "ddl:value", "PARAMETER STYLE JAVA");
+
+ // Create Index
+ // CREATE INDEX IXSALE ON SAMP.SALES (SALES);
+ Node indexNode = findNode(ddlNode, "IXSALE", "derbyddl:createIndexStatement");
+ assertNotNull(indexNode);
+ verifySimpleStringProperty(indexNode, "derbyddl:tableName", "SAMP.SALES");
+ Node colRefNode = findNode(indexNode, "SALES");
+ assertNotNull(colRefNode);
+ colRefNode = findNode(ddlNode, "SALES", "derbyddl:indexColumnReference");
+ assertNotNull(colRefNode);
+ verifyNodeTypes(colRefNode, "SALES",
+ "derbyddl:indexColumnReference",
+ "ddl:columnReference",
+ "ddl:referenceOperand");
+
+ // declare global temporary table SESSION.t1(c11 int) not logged;
+ Node ttNode = findNode(ddlNode, "SESSION.t1", "derbyddl:declareGlobalTemporaryTableStatement");
+ assertNotNull(ttNode);
+ Node colNode = findNode(ttNode, "c11");
+ assertNotNull(colNode);
+ verifySimpleStringProperty(colNode, "ddl:datatypeName", "int");
+
+ // LOCK TABLE FlightAvailability IN EXCLUSIVE MODE;
+ Node lockNode = findNode(ddlNode, "FlightAvailability", "derbyddl:lockTableStatement");
+ assertNotNull(lockNode);
+ Node optionNode = findNode(lockNode, "lockMode");
+ assertNotNull(optionNode);
+ verifySimpleStringProperty(optionNode, "ddl:value", "EXCLUSIVE");
+
+ // RENAME TABLE SAMP.EMP_ACT TO EMPLOYEE_ACT
+ Node renameTableNode = findNode(ddlNode, "SAMP.EMP_ACT", "derbyddl:renameTableStatement");
+ assertNotNull(renameTableNode);
+ verifySimpleStringProperty(renameTableNode, "ddl:newName", "EMPLOYEE_ACT");
+
+ // CREATE SYNONYM SAMP.T1 FOR SAMP.TABLEWITHLONGNAME;
+ Node synonymNode = findNode(ddlNode, "SAMP.T1", "derbyddl:createSynonymStatement");
+ assertNotNull(synonymNode);
+ verifySimpleStringProperty(synonymNode, "derbyddl:tableName", "SAMP.TABLEWITHLONGNAME");
+
+ //CREATE TRIGGER FLIGHTSDELETE3
+ // AFTER DELETE ON FLIGHTS
+ // REFERENCING OLD AS OLD
+ // FOR EACH ROW
+ // DELETE FROM FLIGHTAVAILABILITY WHERE FLIGHT_ID = OLD.FLIGHT_ID;
+ Node triggerNode = findNode(ddlNode, "FLIGHTSDELETE3", "derbyddl:createTriggerStatement");
+ assertNotNull(triggerNode);
+ verifySimpleStringProperty(triggerNode, "derbyddl:tableName", "FLIGHTS");
+
+ //CREATE TRIGGER t1 NO CASCADE BEFORE UPDATE ON x
+ // FOR EACH ROW MODE DB2SQL
+ // values app.notifyEmail('Jerry', 'Table x is about to be updated');
+ triggerNode = findNode(ddlNode, "t1", "derbyddl:createTriggerStatement");
+ assertNotNull(triggerNode);
+ verifySimpleStringProperty(triggerNode, "derbyddl:tableName", "x");
+ optionNode = findNode(triggerNode, "forEach");
+ assertNotNull(optionNode);
+ verifySimpleStringProperty(optionNode, "ddl:value", "FOR EACH ROW");
+ optionNode = findNode(triggerNode, "eventType");
+ assertNotNull(optionNode);
+ verifySimpleStringProperty(optionNode, "ddl:value", "UPDATE");
+
+ //GRANT EXECUTE ON PROCEDURE p TO george;
+ Node grantNode = findNode(ddlNode, "p", "derbyddl:grantOnProcedureStatement");
+ assertNotNull(grantNode);
+
+ //GRANT purchases_reader_role TO george,maria;
+ grantNode = findNode(ddlNode, "grantRoles", "derbyddl:grantRolesStatement");
+ assertNotNull(grantNode);
+ Node roleNode = findNode(grantNode, "george", "ddl:grantee");
+ assertNotNull(roleNode);
+
+ }
+ }
+ }
+
+ System.out.println("FINISHED: shouldSequenceDerbyDdlFile(derby_test_statements.ddl)");
+ }
+}
Property changes on: trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/DerbyDdlSequencerIntegrationTest.java
___________________________________________________________________
Name: svn:mime-type
+ text/plain
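The tests above locate sequenced statement nodes with a recursive `findNode(Node, String)` helper (deleted from this file earlier in the diff, presumably relocated to `DdlIntegrationTestUtil`). A minimal sketch of that depth-first traversal over a plain tree, with a hypothetical `SimpleNode` class standing in for the `javax.jcr.Node` API:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for a JCR node: a name plus ordered children.
class SimpleNode {
    final String name;
    final List<SimpleNode> children = new ArrayList<>();
    SimpleNode(String name) { this.name = name; }
    SimpleNode add(String childName) {
        SimpleNode child = new SimpleNode(childName);
        children.add(child);
        return child;
    }
}

public class FindNodeSketch {
    // Depth-first search mirroring the tests' findNode helper: return the
    // first node (the start node included) whose name matches, else null.
    static SimpleNode findNode(SimpleNode node, String name) {
        if (node.name.equals(name)) return node;
        for (SimpleNode child : node.children) {
            SimpleNode match = findNode(child, name);
            if (match != null) return match;
        }
        return null;
    }

    public static void main(String[] args) {
        SimpleNode root = new SimpleNode("ddls");
        SimpleNode stmts = root.add("derby_test_statements.ddl").add("statements");
        stmts.add("IXSALE");
        stmts.add("SAMP.T1");
        System.out.println(findNode(root, "IXSALE").name);
        System.out.println(findNode(root, "missing") == null);
    }
}
```

The three-argument `findNode(Node, String, String)` variant used throughout the tests adds a node-type check at each match; the traversal itself is the same.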
Added: trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/OracleDdlSequencerIntegrationTest.java
===================================================================
--- trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/OracleDdlSequencerIntegrationTest.java (rev 0)
+++ trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/OracleDdlSequencerIntegrationTest.java 2010-01-05 20:33:13 UTC (rev 1530)
@@ -0,0 +1,140 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.test.integration.sequencer.ddl.dialect.oracle;
+
+import static org.junit.Assert.assertEquals;
+import java.net.URL;
+import javax.jcr.Node;
+import javax.jcr.NodeIterator;
+import org.jboss.dna.graph.connector.inmemory.InMemoryRepositorySource;
+import org.jboss.dna.jcr.JcrConfiguration;
+import org.jboss.dna.jcr.JcrTools;
+import org.jboss.dna.jcr.SecurityContextCredentials;
+import org.jboss.dna.sequencer.ddl.StandardDdlLexicon;
+import org.jboss.dna.sequencer.ddl.dialect.oracle.OracleDdlLexicon;
+import org.jboss.dna.test.integration.sequencer.ddl.DdlIntegrationTestUtil;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+
+/**
+ * @author blafond
+ *
+ */
+public class OracleDdlSequencerIntegrationTest extends DdlIntegrationTestUtil {
+ private String resourceFolder = ddlTestResourceRootFolder + "/dialect/oracle/";
+
+ @Before
+ public void beforeEach() throws Exception {
+ // Configure the DNA configuration. This could be done by loading a configuration from a file, or by
+ // using a (local or remote) configuration repository, or by setting up the configuration programmatically.
+ // This test uses the programmatic approach...
+
+ tools = new JcrTools();
+
+ String repositoryName = "ddlRepository";
+ String workspaceName = "default";
+ String repositorySource = "ddlRepositorySource";
+
+ JcrConfiguration config = new JcrConfiguration();
+ // Set up the in-memory source where we'll upload the content and where the sequenced output will be stored ...
+ config.repositorySource(repositorySource)
+ .usingClass(InMemoryRepositorySource.class)
+ .setDescription("The repository for our content")
+ .setProperty("defaultWorkspaceName", workspaceName);
+ // Set up the JCR repository to use the source ...
+ config.repository(repositoryName)
+ .addNodeTypes(getUrl(ddlTestResourceRootFolder + "StandardDdl.cnd"))
+ .addNodeTypes(getUrl(resourceFolder + "OracleDdl.cnd"))
+ .registerNamespace(StandardDdlLexicon.Namespace.PREFIX, StandardDdlLexicon.Namespace.URI)
+ .registerNamespace(OracleDdlLexicon.Namespace.PREFIX, OracleDdlLexicon.Namespace.URI)
+ .setSource(repositorySource);
+ // Set up the DDL sequencer ...
+ config.sequencer("DDL Sequencer")
+ .usingClass("org.jboss.dna.sequencer.ddl.DdlSequencer")
+ .loadedFromClasspath()
+ .setDescription("Sequences DDL files to extract individual statements and accompanying statement properties and values")
+ .sequencingFrom("//(*.(ddl)[*])/jcr:content[@jcr:data]")
+ .andOutputtingTo("/ddls/$1");
+ config.save();
+ this.engine = config.build();
+ this.engine.start();
+
+ this.session = this.engine.getRepository(repositoryName)
+ .login(new SecurityContextCredentials(new MyCustomSecurityContext()), workspaceName);
+
+ }
+
+ private URL getUrl(String urlStr) {
+ return this.getClass().getClassLoader().getResource(urlStr);
+ }
+
+ @After
+ public void afterEach() throws Exception {
+ if (this.session != null) {
+ this.session.logout();
+ }
+ if (this.engine != null) {
+ this.engine.shutdown();
+ }
+ }
+
+
+ @Test
+ public void shouldSequenceOracleDdlFile() throws Exception {
+ System.out.println("STARTED: shouldSequenceOracleDdlFile(oracle_test_statements.ddl)");
+ URL url = getUrl(resourceFolder + "oracle_test_statements.ddl");
+ uploadFile(url);
+
+ waitUntilSequencedNodesIs(1);
+
+ // Find the node ...
+ Node root = session.getRootNode();
+
+ if (root.hasNode("ddls") ) {
+ if (root.hasNode("ddls")) {
+ Node ddlsNode = root.getNode("ddls");
+ //System.out.println(" | NAME: " + ddlsNode.getName() + " PATH: " + ddlsNode.getPath());
+ for (NodeIterator iter = ddlsNode.getNodes(); iter.hasNext();) {
+ Node ddlNode = iter.nextNode();
+
+ long numStatements = ddlNode.getNodes().nextNode().getNodes().getSize();
+ assertEquals(50, numStatements);
+
+ //printNodeProperties(ddlNode);
+
+ verifyNode(ddlNode, "address", "ddl:startLineNumber");
+ verifyNode(ddlNode, "cust_orders", "ddl:expression");
+ verifyMixin(ddlNode, "cust_orders", "oracleddl:createIndexStatement");
+ verifyNodeType(ddlNode, "cust_orders", "oracleddl:createIndexStatement");
+ verifyNodeType(ddlNode, "cust_orders", "ddl:creatable");
+ verifyNode(ddlNode, "cust_orders", "ddl:startCharIndex", 1698);
+ verifyNode(ddlNode, "customers_dim", "ddl:startColumnNumber");
+ }
+ }
+ }
+
+ System.out.println("FINISHED: shouldSequenceOracleDdlFile(oracle_test_statements.ddl)");
+ }
+}
Property changes on: trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/OracleDdlSequencerIntegrationTest.java
___________________________________________________________________
Name: svn:mime-type
+ text/plain
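The property helpers deleted earlier in the diff (`verifySingleValueProperty` and friends) share one check: the expected value must either equal a single-valued property or appear somewhere among a multi-valued property's values. A minimal sketch of that contains-or-equals logic over plain strings, with the JCR `Property`/`Value` API abstracted away:

```java
import java.util.Arrays;

public class PropertyCheckSketch {
    // Mirrors verifySingleValueProperty's two branches: for a multi-valued
    // property the expected value only has to appear in the value array; for
    // a single-valued property it must match exactly.
    static boolean hasValue(Object propertyValue, String expected) {
        if (propertyValue instanceof String[]) {
            return Arrays.asList((String[]) propertyValue).contains(expected);
        }
        return expected.equals(propertyValue);
    }

    public static void main(String[] args) {
        // Single-valued, as in verifySingleValueProperty(privNode, "ddl:type", "SELECT")
        System.out.println(hasValue("SELECT", "SELECT"));
        // Multi-valued: expected value appears in the array
        System.out.println(hasValue(new String[] {"UPDATE", "USAGE"}, "USAGE"));
        System.out.println(hasValue(new String[] {"UPDATE"}, "SELECT"));
    }
}
```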
Added: trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/PostgresDdlSequencerIntegrationTest.java
===================================================================
--- trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/PostgresDdlSequencerIntegrationTest.java (rev 0)
+++ trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/PostgresDdlSequencerIntegrationTest.java 2010-01-05 20:33:13 UTC (rev 1530)
@@ -0,0 +1,161 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.test.integration.sequencer.ddl.dialect.postgres;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNotNull;
+import java.net.URL;
+import javax.jcr.Node;
+import javax.jcr.NodeIterator;
+import org.jboss.dna.graph.connector.inmemory.InMemoryRepositorySource;
+import org.jboss.dna.jcr.JcrConfiguration;
+import org.jboss.dna.jcr.JcrTools;
+import org.jboss.dna.jcr.SecurityContextCredentials;
+import org.jboss.dna.sequencer.ddl.StandardDdlLexicon;
+import org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon;
+import org.jboss.dna.test.integration.sequencer.ddl.DdlIntegrationTestUtil;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+
+/**
+ *
+ */
+public class PostgresDdlSequencerIntegrationTest extends DdlIntegrationTestUtil {
+ private String resourceFolder = ddlTestResourceRootFolder + "/dialect/postgres/";
+
+ @Before
+ public void beforeEach() throws Exception {
+ // Configure the DNA configuration. This could be done by loading a configuration from a file, or by
+ // using a (local or remote) configuration repository, or by setting up the configuration programmatically.
+ // This test uses the programmatic approach...
+
+ tools = new JcrTools();
+
+ String repositoryName = "ddlRepository";
+ String workspaceName = "default";
+ String repositorySource = "ddlRepositorySource";
+
+ JcrConfiguration config = new JcrConfiguration();
+ // Set up the in-memory source where we'll upload the content and where the sequenced output will be stored ...
+ config.repositorySource(repositorySource)
+ .usingClass(InMemoryRepositorySource.class)
+ .setDescription("The repository for our content")
+ .setProperty("defaultWorkspaceName", workspaceName);
+ // Set up the JCR repository to use the source ...
+ config.repository(repositoryName)
+ .addNodeTypes(getUrl(ddlTestResourceRootFolder + "StandardDdl.cnd"))
+ .addNodeTypes(getUrl(resourceFolder + "PostgresDdl.cnd"))
+ .registerNamespace(StandardDdlLexicon.Namespace.PREFIX, StandardDdlLexicon.Namespace.URI)
+ .registerNamespace(PostgresDdlLexicon.Namespace.PREFIX, PostgresDdlLexicon.Namespace.URI)
+ .setSource(repositorySource);
+ // Set up the DDL sequencer ...
+ config.sequencer("DDL Sequencer")
+ .usingClass("org.jboss.dna.sequencer.ddl.DdlSequencer")
+ .loadedFromClasspath()
+ .setDescription("Sequences DDL files to extract individual statements and accompanying statement properties and values")
+ .sequencingFrom("//(*.(ddl)[*])/jcr:content[@jcr:data]")
+ .andOutputtingTo("/ddls/$1");
+ config.save();
+ this.engine = config.build();
+ this.engine.start();
+
+ this.session = this.engine.getRepository(repositoryName)
+ .login(new SecurityContextCredentials(new MyCustomSecurityContext()), workspaceName);
+
+ }
+
+ private URL getUrl(String urlStr) {
+ return this.getClass().getClassLoader().getResource(urlStr);
+ }
+
+ @After
+ public void afterEach() throws Exception {
+ if (this.session != null) {
+ this.session.logout();
+ }
+ if (this.engine != null) {
+ this.engine.shutdown();
+ }
+ }
+
+ @Test
+ public void shouldSequencePostgresDdlFile() throws Exception {
+ System.out.println("STARTED: shouldSequencePostgresDdlFile(postgres_test_statements.ddl)");
+ URL url = getUrl(resourceFolder + "postgres_test_statements.ddl");
+ uploadFile(url);
+
+ waitUntilSequencedNodesIs(1);
+
+ // Find the node ...
+ Node root = session.getRootNode();
+
+ if (root.hasNode("ddls")) {
+ Node ddlsNode = root.getNode("ddls");
+ for (NodeIterator iter = ddlsNode.getNodes(); iter.hasNext();) {
+ Node ddlNode = iter.nextNode();
+
+ // Each sequenced DDL node should contain exactly 106 statement nodes ...
+ long numStatements = ddlNode.getNodes().nextNode().getNodes().getSize();
+ assertEquals(106, numStatements);
+
+ //CREATE FUNCTION increment ...
+ verifyNodeType(ddlNode, "increment", "postgresddl:createFunctionStatement");
+ verifyNode(ddlNode, "increment", "ddl:expression");
+ verifyNodeType(ddlNode, "increment", "ddl:creatable");
+ verifyNodeType(ddlNode, "increment", "postgresddl:functionOperand");
+ verifyNode(ddlNode, "increment", "ddl:startLineNumber", 214);
+ verifyNode(ddlNode, "increment", "ddl:startCharIndex", 7604);
+
+ //COMMENT ON FUNCTION my_function (timestamp) IS 'Returns Roman Numeral';
+ verifyNodeType(ddlNode, "my_function", "postgresddl:commentOnStatement");
+ verifyNode(ddlNode, "my_function", "ddl:expression");
+ verifyNodeType(ddlNode, "my_function", "postgresddl:commentOperand");
+ verifyNode(ddlNode, "my_function", "ddl:startLineNumber", 44);
+ verifyNode(ddlNode, "my_function", "ddl:startCharIndex", 1573);
+ verifyNode(ddlNode, "my_function", "postgresddl:comment", "'Returns Roman Numeral'");
+
+ //ALTER TABLE foreign_companies RENAME COLUMN address TO city;
+ Node alterTableNode = findNode(ddlNode, "foreign_companies", "postgresddl:alterTableStatement");
+ assertNotNull(alterTableNode);
+ Node renameColNode = findNode(alterTableNode, "address", "postgresddl:renamedColumn");
+ assertNotNull(renameColNode);
+ verifySingleValueProperty(renameColNode, "ddl:newName", "city");
+
+ //GRANT EXECUTE ON FUNCTION divideByTwo(numerator int, IN demoninator int) TO george;
+ Node grantNode = findNode(ddlNode, "divideByTwo", "postgresddl:grantOnFunctionStatement");
+ assertNotNull(grantNode);
+ Node parameter_1 = findNode(grantNode, "numerator", "postgresddl:functionParameter");
+ assertNotNull(parameter_1);
+ verifySingleValueProperty(parameter_1, "ddl:datatypeName", "int");
+ }
+ }
+
+ System.out.println("FINISHED: shouldSequencePostgresDdlFile(postgres_test_statements.ddl)");
+ }
+}
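The test above calls a `waitUntilSequencedNodesIs(1)` helper that is defined elsewhere in the test class and does not appear in this diff. Its general shape is a poll-until-condition loop with a timeout; the following is a minimal, generic sketch of that pattern, with hypothetical names and timeout values (it is not the actual DNA implementation):

```java
import java.util.function.LongSupplier;

public class PollUntil {

    /**
     * Polls the supplied counter until it reaches the expected value or the
     * timeout expires. Returns true if the expected value was observed.
     */
    public static boolean waitUntilCountIs(LongSupplier counter, long expected, long timeoutMillis) {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (System.currentTimeMillis() < deadline) {
            if (counter.getAsLong() == expected) return true;
            try {
                Thread.sleep(50); // back off briefly between polls
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return false;
            }
        }
        // One final check in case the value arrived right at the deadline.
        return counter.getAsLong() == expected;
    }

    public static void main(String[] args) {
        final long[] count = {0};
        // Simulate an asynchronous sequencer that produces its output after a delay.
        new Thread(() -> {
            try {
                Thread.sleep(200);
            } catch (InterruptedException ignored) {
            }
            count[0] = 1;
        }).start();
        System.out.println(waitUntilCountIs(() -> count[0], 1, 5000));
    }
}
```

Polling with a deadline (rather than a fixed `Thread.sleep`) keeps the test fast on quick machines while still tolerating slow sequencing on loaded build servers.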
Property changes on: trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/PostgresDdlSequencerIntegrationTest.java
___________________________________________________________________
Name: svn:mime-type
+ text/plain
Deleted: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/DerbyDdl.cnd
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/DerbyDdl.cnd 2010-01-05 19:16:20 UTC (rev 1529)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/DerbyDdl.cnd 2010-01-05 20:33:13 UTC (rev 1530)
@@ -1,102 +0,0 @@
-/*
- * JBoss DNA (http://www.jboss.org/dna)
- * See the COPYRIGHT.txt file distributed with this work for information
- * regarding copyright ownership. Some portions may be licensed
- * to Red Hat, Inc. under one or more contributor license agreements.
- * See the AUTHORS.txt file in the distribution for a full listing of
- * individual contributors.
- *
- * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
- * is licensed to you under the terms of the GNU Lesser General Public License as
- * published by the Free Software Foundation; either version 2.1 of
- * the License, or (at your option) any later version.
- *
- * JBoss DNA is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with this software; if not, write to the Free
- * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
- * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
- */
-
- //------------------------------------------------------------------------------
-// N A M E S P A C E S
-//------------------------------------------------------------------------------
-<jcr='http://www.jcp.org/jcr/1.0'>
-<nt='http://www.jcp.org/jcr/nt/1.0'>
-<mix='http://www.jcp.org/jcr/mix/1.0'>
-<ddl='http://www.jboss.org/dna/ddl/1.0'>
-<derbyddl='http://www.jboss.org/dna/ddl/derby/1.0'>
-
-// =============================================================================
-// OPERANDS
-// =============================================================================
-[derbyddl:functionOperand] > ddl:operand abstract
-[derbyddl:indexOperand] > ddl:operand abstract
-[derbyddl:procedureOperand] > ddl:operand abstract
-[derbyddl:roleOperand] > ddl:operand abstract
-[derbyddl:synonymOperand] > ddl:operand abstract
-[derbyddl:triggerOperand] > ddl:operand abstract
-
-[derbyddl:roleName] > derbyddl:roleOperand mixin
-
-// =============================================================================
-// COLUMN
-// =============================================================================
-[derbyddl:columnDefinition] > ddl:columnDefinition mixin
- - derbyddl:dropDefault (boolean)
-
-[derbyddl:functionParameter] > ddl:columnDefinition mixin
-
-[derbyddl:indexColumnReference] > ddl:columnReference mixin
- - derbyddl:order (STRING)
-
-// =============================================================================
-// CREATE STATEMENTS
-// =============================================================================
-[derbyddl:createFunctionStatement] > ddl:creatable, ddl:statement, derbyddl:functionOperand mixin
- - ddl:datatypeName (STRING) mandatory
- - ddl:datatypeLength (LONG)
- - ddl:datatypePrecision (LONG)
- - ddl:datatypeScale (LONG)
- - ddl:isTableType (boolean)
- + * (derbyddl:functionParameter) = derbyddl:functionParameter multiple
- + * (ddl:statementOption) = ddl:statementOption multiple
-[derbyddl:createIndexStatement] > ddl:statement, ddl:creatable, derbyddl:indexOperand mixin
- - derbyddl:tableName (string) mandatory
- - derbyddl:unique (boolean)
- + * (derbyddl:indexColumnReference) = derbyddl:indexColumnReference multiple
-[derbyddl:createProcedureStatement] > ddl:creatable, ddl:statement, derbyddl:procedureOperand mixin
-[derbyddl:createRoleStatement] > ddl:creatable, ddl:statement, derbyddl:roleOperand mixin
-[derbyddl:createSynonymStatement] > ddl:creatable, ddl:statement, derbyddl:synonymOperand mixin
- - derbyddl:tableName (string) mandatory
-[derbyddl:createTriggerStatement] > ddl:creatable, ddl:statement, derbyddl:triggerOperand mixin
- - derbyddl:tableName (string) mandatory
- - ddl:sql (string) mandatory
- + * (ddl:columnReference) = ddl:columnreference multiple
-[derbyddl:declareGlobalTemporaryTableStatement] > ddl:createTableStatement mixin
-
-// =============================================================================
-// DROP STATEMENTS
-// =============================================================================
-[derbyddl:dropFunctionStatement] > ddl:droppable, derbyddl:functionOperand mixin
-[derbyddl:dropIndexStatement] > ddl:droppable, derbyddl:indexOperand mixin
-[derbyddl:dropProcedureStatement] > ddl:droppable, derbyddl:procedureOperand mixin
-[derbyddl:dropRoleStatement] > ddl:droppable, derbyddl:roleOperand mixin
-[derbyddl:dropSynonymStatement] > ddl:droppable, derbyddl:synonymOperand mixin
-[derbyddl:dropTriggerStatement] > ddl:droppable, derbyddl:triggerOperand mixin
-
-// =============================================================================
-// MISC STATEMENTS
-// =============================================================================
-[derbyddl:lockTableStatement] > ddl:statement, ddl:tableOperand mixin
-[derbyddl:renameTableStatement] > ddl:statement, ddl:renamable, ddl:tableOperand mixin
-
-[derbyddl:grantOnFunctionStatement] > ddl:grantStatement, derbyddl:functionOperand mixin
-[derbyddl:grantOnProcedureStatement] > ddl:grantStatement, derbyddl:procedureOperand mixin
-
-[derbyddl:grantRolesStatement] > ddl:grantStatement mixin
- + ddl:name (derbyddl:roleName) = derbyddl:roleName multiple
\ No newline at end of file
Deleted: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/OracleDdl.cnd
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/OracleDdl.cnd 2010-01-05 19:16:20 UTC (rev 1529)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/OracleDdl.cnd 2010-01-05 20:33:13 UTC (rev 1530)
@@ -1,202 +0,0 @@
-/*
- * JBoss DNA (http://www.jboss.org/dna)
- * See the COPYRIGHT.txt file distributed with this work for information
- * regarding copyright ownership. Some portions may be licensed
- * to Red Hat, Inc. under one or more contributor license agreements.
- * See the AUTHORS.txt file in the distribution for a full listing of
- * individual contributors.
- *
- * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
- * is licensed to you under the terms of the GNU Lesser General Public License as
- * published by the Free Software Foundation; either version 2.1 of
- * the License, or (at your option) any later version.
- *
- * JBoss DNA is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with this software; if not, write to the Free
- * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
- * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
- */
-
- //------------------------------------------------------------------------------
-// N A M E S P A C E S
-//------------------------------------------------------------------------------
-<jcr='http://www.jcp.org/jcr/1.0'>
-<nt='http://www.jcp.org/jcr/nt/1.0'>
-<mix='http://www.jcp.org/jcr/mix/1.0'>
-<ddl='http://www.jboss.org/dna/ddl/1.0'>
-<oracleddl='http://www.jboss.org/dna/ddl/oracle/1.0'>
-
-// =============================================================================
-// OPERANDS
-// =============================================================================
-[oracleddl:clusterOperand] > ddl:operand abstract
-[oracleddl:commentOperand] > ddl:operand abstract
-[oracleddl:contextOperand] > ddl:operand abstract
-[oracleddl:controlfileOperand] > ddl:operand abstract
-[oracleddl:databaseOperand] > ddl:operand abstract
-[oracleddl:dimensionOperand] > ddl:operand abstract
-[oracleddl:directoryOperand] > ddl:operand abstract
-[oracleddl:diskgroupOperand] > ddl:operand abstract
-[oracleddl:functionOperand] > ddl:operand abstract
-[oracleddl:indexOperand] > ddl:operand abstract
-[oracleddl:indextypeOperand] > ddl:operand abstract
-[oracleddl:javaOperand] > ddl:operand abstract
-[oracleddl:libraryOperand] > ddl:operand abstract
-[oracleddl:materializedOperand] > ddl:operand abstract
-[oracleddl:operatorOperand] > ddl:operand abstract
-[oracleddl:outlineOperand] > ddl:operand abstract
-[oracleddl:packageOperand] > ddl:operand abstract
-[oracleddl:pfileOperand] > ddl:operand abstract
-[oracleddl:procedureOperand] > ddl:operand abstract
-[oracleddl:profileOperand] > ddl:operand abstract
-[oracleddl:resourceOperand] > ddl:operand abstract
-[oracleddl:roleOperand] > ddl:operand abstract
-[oracleddl:rollbackOperand] > ddl:operand abstract
-[oracleddl:sequenceOperand] > ddl:operand abstract
-[oracleddl:sessionOperand] > ddl:operand abstract
-[oracleddl:spfileOperand] > ddl:operand abstract
-[oracleddl:systemOperand] > ddl:operand abstract
-[oracleddl:synonymOperand] > ddl:operand abstract
-[oracleddl:tablespaceOperand] > ddl:operand abstract
-[oracleddl:triggerOperand] > ddl:operand abstract
-[oracleddl:typeOperand] > ddl:operand abstract
-[oracleddl:userOperand] > ddl:operand abstract
-
-// =============================================================================
-// COLUMN
-// =============================================================================
-[oracleddl:columnDefinition] > ddl:columnDefinition
- - oracleddl:dropDefault (boolean)
-
-// =============================================================================
-// ALTER STATEMENTS
-// =============================================================================
-[oracleddl:alterClusterStatement] > ddl:alterable, ddl:statement, oracleddl:clusterOperand mixin
-[oracleddl:alterDatabaseStatement] > ddl:alterable, ddl:statement, oracleddl:databaseOperand mixin
-[oracleddl:alterDimensionStatement] > ddl:alterable, ddl:statement, oracleddl:dimensionOperand mixin
-[oracleddl:alterDiskgroupStatement] > ddl:alterable, ddl:statement, oracleddl:diskgroupOperand mixin
-[oracleddl:alterFunctionStatement] > ddl:alterable, ddl:statement, oracleddl:functionOperand mixin
-[oracleddl:alterIndexStatement] > ddl:alterable, ddl:statement, oracleddl:indexOperand mixin
-[oracleddl:alterIndextypeStatement] > ddl:alterable, ddl:statement, oracleddl:indextypeOperand mixin
-[oracleddl:alterJavaStatement] > ddl:alterable, ddl:statement, oracleddl:javaOperand mixin
-[oracleddl:alterMaterializedStatement] > ddl:alterable, ddl:statement, oracleddl:materializedOperand mixin
-[oracleddl:alterOperatorStatement] > ddl:alterable, ddl:statement, oracleddl:operatorOperand mixin
-[oracleddl:alterOutlineStatement] > ddl:alterable, ddl:statement, oracleddl:outlineOperand mixin
-[oracleddl:alterPackageStatement] > ddl:alterable, ddl:statement, oracleddl:packageOperand mixin
-[oracleddl:alterProcedureStatement] > ddl:alterable, ddl:statement, oracleddl:procedureOperand mixin
-[oracleddl:alterProfileStatement] > ddl:alterable, ddl:statement, oracleddl:profileOperand mixin
-[oracleddl:alterResourceStatement] > ddl:alterable, ddl:statement, oracleddl:resourceOperand mixin
-[oracleddl:alterRoleStatement] > ddl:alterable, ddl:statement, oracleddl:roleOperand mixin
-[oracleddl:alterRollbackStatement] > ddl:alterable, ddl:statement, oracleddl:rollbackOperand mixin
-[oracleddl:alterSequenceStatement] > ddl:alterable, ddl:statement, oracleddl:sequenceOperand mixin
-[oracleddl:alterSessionStatement] > ddl:alterable, ddl:statement, oracleddl:sessionOperand mixin
-[oracleddl:alterSystemStatement] > ddl:alterable, ddl:statement, oracleddl:systemOperand mixin
-[oracleddl:alterTablespaceStatement] > ddl:alterable, ddl:statement, oracleddl:tablespaceOperand mixin
-[oracleddl:alterTriggerStatement] > ddl:alterable, ddl:statement, oracleddl:triggerOperand mixin
-[oracleddl:alterTypeStatement] > ddl:alterable, ddl:statement, oracleddl:typeOperand mixin
-[oracleddl:alterUserStatement] > ddl:alterable, ddl:statement, oracleddl:userOperand mixin
-[oracleddl:alterViewStatement] > ddl:alterable, ddl:statement, ddl:viewOperand mixin
-
-[oracleddl:alterTableStatement] > ddl:alterTableStatement mixin
- - oracleddl:newTableName (STRING)
- + oracleddl:renameColumn (ddl:renamable) = ddl:renamable multiple
- + oracleddl:renameConstraint (ddl:renamable) = ddl:renamable multiple
-
-// =============================================================================
-// CREATE STATEMENTS
-// =============================================================================
-
-[oracleddl:createClusterStatement] > ddl:creatable, ddl:statement, oracleddl:clusterOperand mixin
-[oracleddl:createContextStatement] > ddl:creatable, ddl:statement, oracleddl:contextOperand mixin
-[oracleddl:createControlfileStatement] > ddl:creatable, ddl:statement, oracleddl:controlfileOperand mixin
-[oracleddl:createDatabaseStatement] > ddl:creatable, ddl:statement, oracleddl:databaseOperand mixin
-[oracleddl:createDimensionStatement] > ddl:creatable, ddl:statement, oracleddl:dimensionOperand mixin
-[oracleddl:createDirectoryStatement] > ddl:creatable, ddl:statement, oracleddl:directoryOperand mixin
-[oracleddl:createDiskgroupStatement] > ddl:creatable, ddl:statement, oracleddl:diskgroupOperand mixin
-[oracleddl:createFunctionStatement] > ddl:creatable, ddl:statement, oracleddl:functionOperand mixin
-[oracleddl:createIndexStatement] > ddl:creatable, ddl:statement, oracleddl:indexOperand mixin
- - oracleddl:tableName (string) mandatory
- - oracleddl:unique (boolean)
- - oracleddl:bitmap (boolean)
- + * (ddl:columnReference) = ddl:columnReference multiple
-[oracleddl:createIndexTypeStatement] > ddl:creatable, ddl:statement, oracleddl:indextypeOperand mixin
-[oracleddl:createJavaStatement] > ddl:creatable, ddl:statement, oracleddl:javaOperand mixin
-[oracleddl:createLibraryStatement] > ddl:creatable, ddl:statement, oracleddl:libraryOperand mixin
-[oracleddl:createMaterializedStatement] > ddl:creatable, ddl:statement, oracleddl:materializedOperand mixin
-[oracleddl:createOperatorStatement] > ddl:creatable, ddl:statement, oracleddl:operatorOperand mixin
-[oracleddl:createOutlineStatement] > ddl:creatable, ddl:statement, oracleddl:outlineOperand mixin
-[oracleddl:createPackageStatement] > ddl:creatable, ddl:statement, oracleddl:packageOperand mixin
-[oracleddl:createPfileStatement] > ddl:creatable, ddl:statement, oracleddl:pfileOperand mixin
-[oracleddl:createProcedureStatement] > ddl:creatable, ddl:statement, oracleddl:procedureOperand mixin
-[oracleddl:createRoleStatement] > ddl:creatable, ddl:statement, oracleddl:roleOperand mixin
-[oracleddl:createRollbackStatement] > ddl:creatable, ddl:statement, oracleddl:rollbackOperand mixin
-[oracleddl:createSequenceStatement] > ddl:creatable, ddl:statement, oracleddl:sequenceOperand mixin
-[oracleddl:createSpfileStatement] > ddl:creatable, ddl:statement, oracleddl:spfileOperand mixin
-[oracleddl:createSynonymStatement] > ddl:creatable, ddl:statement, oracleddl:synonymOperand mixin
-[oracleddl:createTablespaceStatement] > ddl:creatable, ddl:statement, oracleddl:tablespaceOperand mixin
-[oracleddl:createTriggerStatement] > ddl:creatable, ddl:statement, oracleddl:triggerOperand mixin
-[oracleddl:createTypeStatement] > ddl:creatable, ddl:statement, oracleddl:typeOperand mixin
-[oracleddl:createUserStatement] > ddl:creatable, ddl:statement, oracleddl:userOperand mixin
-
-// =============================================================================
-// DROP STATEMENTS
-// =============================================================================
-
-[oracleddl:dropClusterStatement] > ddl:droppable, ddl:statement, oracleddl:clusterOperand mixin
-[oracleddl:dropContextStatement] > ddl:droppable, ddl:statement, oracleddl:contextOperand mixin
-[oracleddl:dropDatabaseStatement] > ddl:droppable, ddl:statement, oracleddl:databaseOperand mixin
-[oracleddl:dropDimensionStatement] > ddl:droppable, ddl:statement, oracleddl:dimensionOperand mixin
-[oracleddl:dropDirectoryStatement] > ddl:droppable, ddl:statement, oracleddl:directoryOperand mixin
-[oracleddl:dropDiskgroupStatement] > ddl:droppable, ddl:statement, oracleddl:diskgroupOperand mixin
-[oracleddl:dropFunctionStatement] > ddl:droppable, ddl:statement, oracleddl:functionOperand mixin
-[oracleddl:dropIndexStatement] > ddl:droppable, ddl:statement, oracleddl:indexOperand mixin
-[oracleddl:dropIndextypeStatement] > ddl:droppable, ddl:statement, oracleddl:indextypeOperand mixin
-[oracleddl:dropJavaStatement] > ddl:droppable, ddl:statement, oracleddl:javaOperand mixin
-[oracleddl:dropLibraryStatement] > ddl:droppable, ddl:statement, oracleddl:libraryOperand mixin
-[oracleddl:dropMaterializedStatement] > ddl:droppable, ddl:statement, oracleddl:materializedOperand mixin
-[oracleddl:dropOperatorStatement] > ddl:droppable, ddl:statement, oracleddl:operatorOperand mixin
-[oracleddl:dropOutlineStatement] > ddl:droppable, ddl:statement, oracleddl:outlineOperand mixin
-[oracleddl:dropPackageStatement] > ddl:droppable, ddl:statement, oracleddl:packageOperand mixin
-[oracleddl:dropProcedureStatement] > ddl:droppable, ddl:statement, oracleddl:procedureOperand mixin
-[oracleddl:dropProfileStatement] > ddl:droppable, ddl:statement, oracleddl:profileOperand mixin
-[oracleddl:dropRoleStatement] > ddl:droppable, ddl:statement, oracleddl:roleOperand mixin
-[oracleddl:dropRollbackStatement] > ddl:droppable, ddl:statement, oracleddl:rollbackOperand mixin
-[oracleddl:dropSequenceStatement] > ddl:droppable, ddl:statement, oracleddl:sequenceOperand mixin
-[oracleddl:dropSynonymStatement] > ddl:droppable, ddl:statement, oracleddl:synonymOperand mixin
-[oracleddl:dropTablespaceStatement] > ddl:droppable, ddl:statement, oracleddl:tablespaceOperand mixin
-[oracleddl:dropTriggerStatement] > ddl:droppable, ddl:statement, oracleddl:triggerOperand mixin
-[oracleddl:dropTypeStatement] > ddl:droppable, ddl:statement, oracleddl:typeOperand mixin
-[oracleddl:dropUserStatement] > ddl:droppable, ddl:statement, oracleddl:userOperand mixin
-
-// =============================================================================
-// MISC STATEMENTS
-// =============================================================================
-
-[oracleddl:analyzeStatement] > ddl:statement mixin
-[oracleddl:associateStatisticsStatement] > ddl:statement mixin
-[oracleddl:auditStatement] > ddl:statement mixin
-[oracleddl:commitStatement] > ddl:statement mixin
-[oracleddl:commentOnStatement] > ddl:statement, oracleddl:commentOperand mixin
- - oracleddl:targetObjectType (STRING) mandatory
- - oracleddl:targetObjectName (STRING)
- - oracleddl:comment (STRING) mandatory
-[oracleddl:disassociateStatisticsStatement] > ddl:statement mixin
-[oracleddl:explainPlanStatement] > ddl:statement mixin
-[oracleddl:flashbackStatement] > ddl:statement mixin
-[oracleddl:lockTableStatement] > ddl:statement mixin
-[oracleddl:mergeStatement] > ddl:statement mixin
-[oracleddl:nestedTableStatement] > ddl:statement mixin
-[oracleddl:noauditStatement] > ddl:statement mixin
-[oracleddl:purgeStatement] > ddl:statement mixin
-[oracleddl:renameStatement] > ddl:statement mixin
-[oracleddl:revokeStatement] > ddl:statement mixin
-[oracleddl:rollbackStatement] > ddl:statement mixin
-[oracleddl:setConstraintsStatement] > ddl:statement, ddl:settable mixin
-[oracleddl:setRoleStatement] > ddl:statement, ddl:settable mixin
-[oracleddl:setTransactionStatement] > ddl:statement, ddl:settable mixin
-[oracleddl:truncateStatement] > ddl:statement mixin
Deleted: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/PostgresDdl.cnd
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/PostgresDdl.cnd 2010-01-05 19:16:20 UTC (rev 1529)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/PostgresDdl.cnd 2010-01-05 20:33:13 UTC (rev 1530)
@@ -1,205 +0,0 @@
-/*
- * JBoss DNA (http://www.jboss.org/dna)
- * See the COPYRIGHT.txt file distributed with this work for information
- * regarding copyright ownership. Some portions may be licensed
- * to Red Hat, Inc. under one or more contributor license agreements.
- * See the AUTHORS.txt file in the distribution for a full listing of
- * individual contributors.
- *
- * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
- * is licensed to you under the terms of the GNU Lesser General Public License as
- * published by the Free Software Foundation; either version 2.1 of
- * the License, or (at your option) any later version.
- *
- * JBoss DNA is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with this software; if not, write to the Free
- * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
- * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
- */
-
- //------------------------------------------------------------------------------
-// N A M E S P A C E S
-//------------------------------------------------------------------------------
-<jcr='http://www.jcp.org/jcr/1.0'>
-<nt='http://www.jcp.org/jcr/nt/1.0'>
-<mix='http://www.jcp.org/jcr/mix/1.0'>
-<ddl='http://www.jboss.org/dna/ddl/1.0'>
-<postgresddl='http://www.jboss.org/dna/ddl/postgres/1.0'>
-
-// =============================================================================
-// OPERANDS
-// =============================================================================
-[postgresddl:aggregateOperand] > ddl:operand abstract
-[postgresddl:castOperand] > ddl:operand abstract
-[postgresddl:commentOperand] > ddl:operand abstract
-[postgresddl:constraintTriggerOperand] > ddl:operand abstract
-[postgresddl:conversionOperand] > ddl:operand abstract
-[postgresddl:databaseOperand] > ddl:operand abstract
-[postgresddl:foreignDataOperand] > ddl:operand abstract
-[postgresddl:groupOperand] > ddl:operand abstract
-[postgresddl:functionOperand] > ddl:operand abstract
-[postgresddl:indexOperand] > ddl:operand abstract
-[postgresddl:languageOperand] > ddl:operand abstract
-[postgresddl:operatorOperand] > ddl:operand abstract
-[postgresddl:ownedByOperand] > ddl:operand abstract
-[postgresddl:roleOperand] > ddl:operand abstract
-[postgresddl:ruleOperand] > ddl:operand abstract
-[postgresddl:sequenceOperand] > ddl:operand abstract
-[postgresddl:serverOperand] > ddl:operand abstract
-[postgresddl:tablespaceOperand] > ddl:operand abstract
-[postgresddl:textSearchOperand] > ddl:operand abstract
-[postgresddl:triggerOperand] > ddl:operand abstract
-[postgresddl:typeOperand] > ddl:operand abstract
-[postgresddl:userOperand] > ddl:operand abstract
-[postgresddl:userMappingOperand] > ddl:operand abstract
-[postgresddl:parameterOperand] > ddl:operand abstract
-
-[postgresddl:functionParameter] > postgresddl:parameterOperand mixin
- - ddl:datatypeName (STRING) mandatory
- - ddl:datatypeLength (LONG)
- - ddl:datatypePrecision (LONG)
- - ddl:datatypeScale (LONG)
- - ddl:nullable (STRING)
- - ddl:defaultOption (STRING)
- - postgresddl:mode (STRING)
-
-[postgresddl:role] > postgresddl:roleOperand mixin
-
-[postgresddl:renamedColumn] > ddl:renamable mixin
-
-// =============================================================================
-// ALTER STATEMENTS
-// =============================================================================
-[postgresddl:alterAggregateStatement] > ddl:alterable, ddl:statement, postgresddl:aggregateOperand mixin
-[postgresddl:alterConversionStatement] > ddl:alterable, ddl:statement, postgresddl:conversionOperand mixin
-[postgresddl:alterDatabaseStatement] > ddl:alterable, ddl:statement, postgresddl:databaseOperand mixin
-[postgresddl:alterForeignDataWrapperStatement] > ddl:alterable, ddl:statement, postgresddl:foreignDataOperand mixin
-[postgresddl:alterFunctionStatement] > ddl:alterable, ddl:statement, postgresddl:functionOperand mixin
-[postgresddl:alterGroupStatement] > ddl:alterable, ddl:statement, postgresddl:groupOperand mixin
-[postgresddl:alterIndexStatement] > ddl:alterable, ddl:statement, postgresddl:indexOperand mixin
-[postgresddl:alterLanguageStatement] > ddl:alterable, ddl:statement, postgresddl:languageOperand mixin
-[postgresddl:alterOperatorStatement] > ddl:alterable, ddl:statement, postgresddl:operatorOperand mixin
-[postgresddl:alterRoleStatement] > ddl:alterable, ddl:statement, postgresddl:roleOperand mixin
-[postgresddl:alterSchemaStatement] > ddl:alterable, ddl:statement, ddl:schemaOperand mixin
-[postgresddl:alterSequenceStatement] > ddl:alterable, ddl:statement, postgresddl:sequenceOperand mixin
-[postgresddl:alterServerStatement] > ddl:alterable, ddl:statement, postgresddl:serverOperand mixin
-[postgresddl:alterTablespaceStatement] > ddl:alterable, ddl:statement, postgresddl:tablespaceOperand mixin
-[postgresddl:alterTextSearchStatement] > ddl:alterable, ddl:statement, postgresddl:textSearchOperand mixin
-[postgresddl:alterTriggerStatement] > ddl:alterable, ddl:statement, postgresddl:triggerOperand mixin
-[postgresddl:alterTypeStatement] > ddl:alterable, ddl:statement, postgresddl:typeOperand mixin
-[postgresddl:alterUserStatement] > ddl:alterable, ddl:statement, postgresddl:userOperand mixin
-[postgresddl:alterUserMappingStatement] > ddl:alterable, ddl:statement, postgresddl:userMappingOperand mixin
-[postgresddl:alterViewStatement] > ddl:alterable, ddl:statement, ddl:viewOperand mixin
-
-[postgresddl:alterTableStatement] > ddl:alterTableStatement mixin
- - postgresddl:newTableName (STRING)
- - postgresddl:schemaName (STRING)
- + postgresddl:renameColumn (postgresddl:renamedColumn) = postgresddl:renamedColumn multiple
-
-
-// =============================================================================
-// CREATE STATEMENTS
-// =============================================================================
-
-[postgresddl:createAggregateStatement] > ddl:creatable, ddl:statement, postgresddl:aggregateOperand mixin
-[postgresddl:createCastStatement] > ddl:creatable, ddl:statement, postgresddl:castOperand mixin
-[postgresddl:createConstraintTriggerStatement] > ddl:creatable, ddl:statement, postgresddl:constraintTriggerOperand mixin
-[postgresddl:createConversionStatement] > ddl:creatable, ddl:statement, postgresddl:conversionOperand mixin
-[postgresddl:createDatabaseStatement] > ddl:creatable, ddl:statement, postgresddl:databaseOperand mixin
-[postgresddl:createForeignDataWrapperStatement] > ddl:creatable, ddl:statement, postgresddl:foreignDataOperand mixin
-[postgresddl:createFunctionStatement] > ddl:creatable, ddl:statement, postgresddl:functionOperand mixin
-[postgresddl:createGroupStatement] > ddl:creatable, ddl:statement, postgresddl:groupOperand mixin
-[postgresddl:createIndexStatement] > ddl:creatable, ddl:statement, postgresddl:indexOperand mixin
-[postgresddl:createLanguageStatement] > ddl:creatable, ddl:statement, postgresddl:languageOperand mixin
-[postgresddl:createOperatorStatement] > ddl:creatable, ddl:statement, postgresddl:operatorOperand mixin
-[postgresddl:createRoleStatement] > ddl:creatable, ddl:statement, postgresddl:roleOperand mixin
-[postgresddl:createRuleStatement] > ddl:creatable, ddl:statement, postgresddl:ruleOperand mixin
-[postgresddl:createSequenceStatement] > ddl:creatable, ddl:statement, postgresddl:sequenceOperand mixin
-[postgresddl:createServerStatement] > ddl:creatable, ddl:statement, postgresddl:serverOperand mixin
-[postgresddl:createTablespaceStatement] > ddl:creatable, ddl:statement, postgresddl:tablespaceOperand mixin
-[postgresddl:createTextSearchStatement] > ddl:creatable, ddl:statement, postgresddl:textSearchOperand mixin
-[postgresddl:createTriggerStatement] > ddl:creatable, ddl:statement, postgresddl:triggerOperand mixin
-[postgresddl:createTypeStatement] > ddl:creatable, ddl:statement, postgresddl:typeOperand mixin
-[postgresddl:createUserStatement] > ddl:creatable, ddl:statement, postgresddl:userOperand mixin
-[postgresddl:createUserMappingStatement] > ddl:creatable, ddl:statement, postgresddl:userMappingOperand mixin
-
-// =============================================================================
-// DROP STATEMENTS
-// =============================================================================
-
-[postgresddl:dropAggregateStatement] > ddl:droppable, ddl:statement, postgresddl:aggregateOperand mixin
-[postgresddl:dropCastStatement] > ddl:droppable, ddl:statement, postgresddl:castOperand mixin
-[postgresddl:dropConstraintTriggerStatement] > ddl:droppable, ddl:statement, postgresddl:constraintTriggerOperand mixin
-[postgresddl:dropConversionStatement] > ddl:droppable, ddl:statement, postgresddl:conversionOperand mixin
-[postgresddl:dropDatabaseStatement] > ddl:droppable, ddl:statement, postgresddl:databaseOperand mixin
-[postgresddl:dropForeignDataWrapperStatement] > ddl:droppable, ddl:statement, postgresddl:foreignDataOperand mixin
-[postgresddl:dropFunctionStatement] > ddl:droppable, ddl:statement, postgresddl:functionOperand mixin
-[postgresddl:dropGroupStatement] > ddl:droppable, ddl:statement, postgresddl:groupOperand mixin
-[postgresddl:dropIndexStatement] > ddl:droppable, ddl:statement, postgresddl:indexOperand mixin
-[postgresddl:dropLanguageStatement] > ddl:droppable, ddl:statement, postgresddl:languageOperand mixin
-[postgresddl:dropOperatorStatement] > ddl:droppable, ddl:statement, postgresddl:operatorOperand mixin
-[postgresddl:dropOwnedByStatement] > ddl:droppable, ddl:statement, postgresddl:ownedByOperand mixin
-[postgresddl:dropRoleStatement] > ddl:droppable, ddl:statement, postgresddl:roleOperand mixin
-[postgresddl:dropRuleStatement] > ddl:droppable, ddl:statement, postgresddl:ruleOperand mixin
-[postgresddl:dropSequenceStatement] > ddl:droppable, ddl:statement, postgresddl:sequenceOperand mixin
-[postgresddl:dropServerStatement] > ddl:droppable, ddl:statement, postgresddl:serverOperand mixin
-[postgresddl:dropTablespaceStatement] > ddl:droppable, ddl:statement, postgresddl:tablespaceOperand mixin
-[postgresddl:dropTextSearchStatement] > ddl:droppable, ddl:statement, postgresddl:textSearchOperand mixin
-[postgresddl:dropTriggerStatement] > ddl:droppable, ddl:statement, postgresddl:triggerOperand mixin
-[postgresddl:dropTypeStatement] > ddl:droppable, ddl:statement, postgresddl:typeOperand mixin
-[postgresddl:dropUserStatement] > ddl:droppable, ddl:statement, postgresddl:userOperand mixin
-[postgresddl:dropUserMappingStatement] > ddl:droppable, ddl:statement, postgresddl:userMappingOperand mixin
-
-// =============================================================================
-// MISC STATEMENTS
-// =============================================================================
-
-[postgresddl:abortStatement] > ddl:statement mixin
-[postgresddl:analyzeStatement] > ddl:statement mixin
-[postgresddl:clusterStatement] > ddl:statement mixin
-[postgresddl:commentOnStatement] > ddl:statement, postgresddl:commentOperand mixin
- - postgresddl:targetObjectType (STRING) mandatory
- - postgresddl:targetObjectName (STRING)
- - postgresddl:comment (STRING) mandatory
-[postgresddl:copyStatement] > ddl:statement mixin
-[postgresddl:deallocateStatement] > ddl:statement mixin
-[postgresddl:declareStatement] > ddl:statement mixin
-[postgresddl:discardStatement] > ddl:statement mixin
-[postgresddl:explainStatement] > ddl:statement mixin
-[postgresddl:fetchStatement] > ddl:statement mixin
-[postgresddl:listenStatement] > ddl:statement mixin
-[postgresddl:loadStatement] > ddl:statement mixin
-[postgresddl:lockTableStatement] > ddl:statement mixin
-[postgresddl:moveStatement] > ddl:statement mixin
-[postgresddl:notifyStatement] > ddl:statement mixin
-[postgresddl:prepareStatement] > ddl:statement mixin
-[postgresddl:reassignOwnedStatement] > ddl:statement mixin
-[postgresddl:reindexStatement] > ddl:statement mixin
-[postgresddl:releaseSavepointStatement] > ddl:statement mixin
-[postgresddl:rollbackStatement] > ddl:statement mixin
-[postgresddl:selectIntoStatement] > ddl:statement mixin
-[postgresddl:showStatement] > ddl:statement mixin
-[postgresddl:truncateStatement] > ddl:statement mixin
-[postgresddl:unlistenStatement] > ddl:statement mixin
-[postgresddl:vacuumStatement] > ddl:statement mixin
-
-// =============================================================================
-// GRANT STATEMENTS
-// =============================================================================
-[postgresddl:grantOnTableStatement] > ddl:grantStatement, ddl:tableOperand mixin
-[postgresddl:grantOnSequenceStatement] > ddl:grantStatement, postgresddl:sequenceOperand mixin
-[postgresddl:grantOnDatabaseStatement] > ddl:grantStatement, postgresddl:databaseOperand mixin
-[postgresddl:grantOnForeignDataWrapperStatement] > ddl:grantStatement, postgresddl:foreignDataOperand mixin
-[postgresddl:grantOnForeignServerStatement] > ddl:grantStatement, postgresddl:serverOperand mixin
-[postgresddl:grantOnFunctionStatement] > ddl:grantStatement, postgresddl:functionOperand mixin
- + postgresddl:parameter (postgresddl:functionParameter) = postgresddl:functionParameter multiple
-[postgresddl:grantOnLanguageStatement] > ddl:grantStatement, postgresddl:languageOperand mixin
-[postgresddl:grantOnSchemaStatement] > ddl:grantStatement, ddl:schemaOperand mixin
-[postgresddl:grantOnTablespaceStatement] > ddl:grantStatement, postgresddl:tablespaceOperand mixin
-[postgresddl:grantRolesStatement] > ddl:grantStatement mixin
- + postgresddl:grantRole (postgresddl:role) = postgresddl:role multiple
Modified: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/StandardDdl.cnd
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/StandardDdl.cnd 2010-01-05 19:16:20 UTC (rev 1529)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/StandardDdl.cnd 2010-01-05 20:33:13 UTC (rev 1530)
@@ -278,3 +278,18 @@
[ddl:grantOnCollationStatement] > ddl:grantStatement, ddl:collationOperand mixin
[ddl:grantOnCharacterSetStatement] > ddl:grantStatement, ddl:characterSetOperand mixin
[ddl:grantOnTranslationStatement] > ddl:grantStatement, ddl:translationOperand mixin
+
+// =============================================================================
+// REVOKE STATEMENTS
+// =============================================================================
+
+[ddl:revokeStatement] > ddl:statement, ddl:revokable, ddl:droppable mixin
+ - ddl:allPrivileges (boolean)
+ + * (ddl:grantPrivilege) = ddl:grantPrivilege multiple
+ + * (ddl:grantee) = ddl:grantee multiple
+
+[ddl:revokeOnTableStatement] > ddl:revokeStatement, ddl:tableOperand mixin
+[ddl:revokeOnDomainStatement] > ddl:revokeStatement, ddl:domainOperand mixin
+[ddl:revokeOnCollationStatement] > ddl:revokeStatement, ddl:collationOperand mixin
+[ddl:revokeOnCharacterSetStatement] > ddl:revokeStatement, ddl:characterSetOperand mixin
+[ddl:revokeOnTranslationStatement] > ddl:revokeStatement, ddl:translationOperand mixin
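The new revoke node types mirror the existing grant-statement hierarchy, with `ddl:droppable` covering the mandatory SQL-92 drop behavior. Illustrative statements these types are intended to model (not taken from the test resources) would look like:

```sql
-- Revoke a table privilege from specific grantees (ddl:revokeOnTableStatement)
REVOKE SELECT ON TABLE t FROM maria, harry RESTRICT;
-- Revoke all privileges on a domain (ddl:revokeOnDomainStatement; sets ddl:allPrivileges)
REVOKE ALL PRIVILEGES ON DOMAIN d FROM PUBLIC RESTRICT;
```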
Deleted: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/derby_test_statements.ddl
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/derby_test_statements.ddl 2010-01-05 19:16:20 UTC (rev 1529)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/derby_test_statements.ddl 2010-01-05 20:33:13 UTC (rev 1530)
@@ -1,218 +0,0 @@
---
--- SAMPLE DERBY STATEMENTS
---
-
--- Add a new column with a column-level constraint
--- to an existing table
--- An exception will be thrown if the table
--- contains any rows
--- since the newcol will be initialized to NULL
--- in all existing rows in the table
-ALTER TABLE CITIES ADD COLUMN REGION VARCHAR(26)
- CONSTRAINT NEW_CONSTRAINT CHECK (REGION IS NOT NULL);
-
--- Add a new unique constraint to an existing table
--- An exception will be thrown if duplicate keys are found
-ALTER TABLE SAMP.DEPARTMENT
- ADD CONSTRAINT NEW_UNIQUE UNIQUE (DEPTNO);
-
--- add a new foreign key constraint to the
--- Cities table. Each row in Cities is checked
--- to make sure it satisfied the constraints.
--- if any rows don't satisfy the constraint, the
--- constraint is not added
-ALTER TABLE CITIES ADD CONSTRAINT COUNTRY_FK
- Foreign Key (COUNTRY) REFERENCES COUNTRIES (COUNTRY);
-
--- Add a primary key constraint to a table
--- First, create a new table
-CREATE TABLE ACTIVITIES (CITY_ID INT NOT NULL,
- SEASON CHAR(2), ACTIVITY VARCHAR(32) NOT NULL);
--- You will not be able to add this constraint if the
--- columns you are including in the primary key have
--- null data or duplicate values.
-ALTER TABLE Activities ADD PRIMARY KEY (city_id, activity);
-
--- Drop the city_id column if there are no dependent objects:
-ALTER TABLE Cities DROP COLUMN city_id RESTRICT;
--- Drop the city_id column, also dropping all dependent objects:
-ALTER TABLE Cities DROP COLUMN city_id CASCADE;
-
--- Drop a primary key constraint from the CITIES table
-
-ALTER TABLE Cities DROP CONSTRAINT Cities_PK;
--- Drop a foreign key constraint from the CITIES table
-ALTER TABLE Cities DROP CONSTRAINT COUNTRIES_FK;
--- add a DEPTNO column with a default value of 1
-ALTER TABLE SAMP.EMP_ACT ADD COLUMN DEPTNO INT DEFAULT 1;
--- increase the width of a VARCHAR column
-ALTER TABLE SAMP.EMP_PHOTO ALTER PHOTO_FORMAT SET DATA TYPE VARCHAR(30);
--- change the lock granularity of a table
-ALTER TABLE SAMP.SALES LOCKSIZE TABLE;
-
--- Remove the NOT NULL constraint from the MANAGER column
-ALTER TABLE Employees ALTER COLUMN Manager NULL;
--- Add the NOT NULL constraint to the SSN column
-ALTER TABLE Employees ALTER COLUMN ssn NOT NULL;
-
--- Change the default value for the SALARY column
-ALTER TABLE Employees ALTER COLUMN Salary DEFAULT 1000.0;
-ALTER TABLE Employees ALTER COLUMN Salary DROP DEFAULT;
-
-
-CREATE FUNCTION TO_DEGREES
- ( RADIANS DOUBLE )
- RETURNS DOUBLE
- PARAMETER STYLE JAVA
- NO SQL LANGUAGE JAVA
- EXTERNAL NAME 'java.lang.Math.toDegrees';
-
-
-CREATE FUNCTION PROPERTY_FILE_READER
- ( FILENAME VARCHAR( 32672 ) )
- RETURNS TABLE
- (
- KEY_COL VARCHAR( 10 ),
- VALUE_COL VARCHAR( 1000 )
- )
- LANGUAGE JAVA
- PARAMETER STYLE DERBY_JDBC_RESULT_SET
- NO SQL
- EXTERNAL NAME 'vtis.example.PropertyFileVTI.propertyFileVTI';
-
-CREATE INDEX OrigIndex ON Flights(orig_airport);
-
-CREATE INDEX PAY_DESC ON SAMP.EMPLOYEE (SALARY);
-
-CREATE INDEX IXSALE ON SAMP.SALES (SALES);
-
-CREATE PROCEDURE SALES.TOTAL_REVENUE(IN S_MONTH INTEGER,
- IN S_YEAR INTEGER, OUT TOTAL DECIMAL(10,2))
- PARAMETER STYLE JAVA READS SQL DATA LANGUAGE JAVA EXTERNAL NAME
- 'com.acme.sales.calculateRevenueByMonth';
-
-CREATE ROLE purchases_reader;
-
-CREATE ROLE purchases_reader_role;
-
-CREATE SCHEMA FLIGHTS AUTHORIZATION anita;
-
-CREATE SCHEMA EMP;
-
-CREATE SCHEMA AUTHORIZATION takumi;
-
-CREATE SYNONYM SAMP.T1 FOR SAMP.TABLEWITHLONGNAME;
-
-CREATE TABLE HOTELAVAILABILITY
- (HOTEL_ID INT NOT NULL, BOOKING_DATE DATE NOT NULL,
- ROOMS_TAKEN INT DEFAULT 0, PRIMARY KEY (HOTEL_ID, BOOKING_DATE));
-
-CREATE TABLE PEOPLE
- (PERSON_ID INT NOT NULL GENERATED ALWAYS AS IDENTITY
- CONSTRAINT PEOPLE_PK PRIMARY KEY, PERSON VARCHAR(26));
-
-CREATE TABLE greetings
- (i int generated by default as identity (START WITH 2, INCREMENT BY 1), ch char(50));
-
-CREATE TABLE GROUPS
- (GROUP_ID SMALLINT NOT NULL GENERATED ALWAYS AS IDENTITY
- (START WITH 5, INCREMENT BY 5), ADDRESS VARCHAR(100), PHONE VARCHAR(15));
-
-CREATE TRIGGER t1 NO CASCADE BEFORE UPDATE ON x
- FOR EACH ROW MODE DB2SQL
- values app.notifyEmail('Jerry', 'Table x is about to be updated');
-
-
-CREATE TRIGGER FLIGHTSDELETE
- AFTER DELETE ON FLIGHTS
- REFERENCING OLD_TABLE AS DELETEDFLIGHTS
- FOR EACH STATEMENT
- DELETE FROM FLIGHTAVAILABILITY WHERE FLIGHT_ID IN
- (SELECT FLIGHT_ID FROM DELETEDFLIGHTS);
-
-CREATE TRIGGER FLIGHTSDELETE3
- AFTER DELETE ON FLIGHTS
- REFERENCING OLD AS OLD
- FOR EACH ROW
- DELETE FROM FLIGHTAVAILABILITY WHERE FLIGHT_ID = OLD.FLIGHT_ID;
-
-
-
-CREATE VIEW SAMP.V1 (COL_SUM, COL_DIFF)
- AS SELECT COMM + BONUS, COMM - BONUS
- FROM SAMP.EMPLOYEE;
-
-CREATE VIEW SAMP.VEMP_RES (RESUME)
- AS VALUES 'Delores M. Quintana', 'Heather A. Nicholls', 'Bruce Adamson';
-
-CREATE VIEW SAMP.PROJ_COMBO
- (PROJNO, PRENDATE, PRSTAFF, MAJPROJ)
- AS SELECT PROJNO, PRENDATE, PRSTAFF, MAJPROJ
- FROM SAMP.PROJECT UNION ALL
- SELECT PROJNO, EMSTDATE, EMPTIME, EMPNO
- FROM SAMP.EMP_ACT
- WHERE EMPNO IS NOT NULL;
-
-CREATE VIEW V1 (C1) AS SELECT SIN(C1) FROM T1;
-
-declare global temporary table SESSION.t1(c11 int) not logged;
--- The SESSION qualification is redundant here because temporary
--- tables can only exist in the SESSION schema.
-
-declare global temporary table t2(c21 int) not logged;
--- The temporary table is not qualified here with SESSION because temporary
--- tables can only exist in the SESSION schema.
-
-DROP FUNCTION some_function_name;
-
-DROP INDEX OrigIndex;
-
-DROP INDEX DestIndex;
-
-DROP PROCEDURE some_procedure_name;
-
-DROP ROLE reader;
-
--- The RESTRICT keyword is required
-DROP SCHEMA SAMP RESTRICT;
-
-DROP SYNONYM some_synonym_name;
-
-DROP TABLE some_table_name;
-
-DROP TRIGGER TRIG1;
-
-DROP VIEW AnIdentifier;
-
-GRANT SELECT ON TABLE t TO maria,harry;
-
-GRANT UPDATE, TRIGGER ON TABLE t TO anita,zhi;
-
-GRANT SELECT ON TABLE s.v to PUBLIC;
-
-GRANT EXECUTE ON PROCEDURE p TO george;
-
-GRANT purchases_reader_role TO george,maria;
-
-GRANT SELECT ON TABLE t TO purchases_reader_role;
-
-INSERT INTO COUNTRIES
- VALUES ('Taiwan', 'TW', 'Asia');
-
-INSERT INTO MA_EMP_ACT
- SELECT * FROM EMP_ACT
- WHERE SUBSTR(PROJNO, 1, 2) = 'MA';
-
--- Insert the DEFAULT value for the LOCATION column
-INSERT INTO DEPARTMENT
- VALUES ('E31', 'ARCHITECTURE', '00390', 'E01', DEFAULT);
-
-LOCK TABLE FlightAvailability IN EXCLUSIVE MODE;
-
-LOCK TABLE Maps IN EXCLUSIVE MODE;
-
-RENAME INDEX DESTINDEX TO ARRIVALINDEX;
-
-RENAME TABLE SAMP.EMP_ACT TO EMPLOYEE_ACT;
-
-
Copied: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/DerbyDdl.cnd (from rev 1528, trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/DerbyDdl.cnd)
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/DerbyDdl.cnd (rev 0)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/DerbyDdl.cnd 2010-01-05 20:33:13 UTC (rev 1530)
@@ -0,0 +1,102 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+
+//------------------------------------------------------------------------------
+// N A M E S P A C E S
+//------------------------------------------------------------------------------
+<jcr='http://www.jcp.org/jcr/1.0'>
+<nt='http://www.jcp.org/jcr/nt/1.0'>
+<mix='http://www.jcp.org/jcr/mix/1.0'>
+<ddl='http://www.jboss.org/dna/ddl/1.0'>
+<derbyddl='http://www.jboss.org/dna/ddl/derby/1.0'>
+
+// =============================================================================
+// OPERANDS
+// =============================================================================
+[derbyddl:functionOperand] > ddl:operand abstract
+[derbyddl:indexOperand] > ddl:operand abstract
+[derbyddl:procedureOperand] > ddl:operand abstract
+[derbyddl:roleOperand] > ddl:operand abstract
+[derbyddl:synonymOperand] > ddl:operand abstract
+[derbyddl:triggerOperand] > ddl:operand abstract
+
+[derbyddl:roleName] > derbyddl:roleOperand mixin
+
+// =============================================================================
+// COLUMN
+// =============================================================================
+[derbyddl:columnDefinition] > ddl:columnDefinition mixin
+ - derbyddl:dropDefault (boolean)
+
+[derbyddl:functionParameter] > ddl:columnDefinition mixin
+
+[derbyddl:indexColumnReference] > ddl:columnReference mixin
+ - derbyddl:order (STRING)
+
+// =============================================================================
+// CREATE STATEMENTS
+// =============================================================================
+[derbyddl:createFunctionStatement] > ddl:creatable, ddl:statement, derbyddl:functionOperand mixin
+ - ddl:datatypeName (STRING) mandatory
+ - ddl:datatypeLength (LONG)
+ - ddl:datatypePrecision (LONG)
+ - ddl:datatypeScale (LONG)
+ - ddl:isTableType (boolean)
+ + * (derbyddl:functionParameter) = derbyddl:functionParameter multiple
+ + * (ddl:statementOption) = ddl:statementOption multiple
+[derbyddl:createIndexStatement] > ddl:statement, ddl:creatable, derbyddl:indexOperand mixin
+ - derbyddl:tableName (string) mandatory
+ - derbyddl:unique (boolean)
+ + * (derbyddl:indexColumnReference) = derbyddl:indexColumnReference multiple
+[derbyddl:createProcedureStatement] > ddl:creatable, ddl:statement, derbyddl:procedureOperand mixin
+[derbyddl:createRoleStatement] > ddl:creatable, ddl:statement, derbyddl:roleOperand mixin
+[derbyddl:createSynonymStatement] > ddl:creatable, ddl:statement, derbyddl:synonymOperand mixin
+ - derbyddl:tableName (string) mandatory
+[derbyddl:createTriggerStatement] > ddl:creatable, ddl:statement, derbyddl:triggerOperand mixin
+ - derbyddl:tableName (string) mandatory
+ - ddl:sql (string) mandatory
+ + * (ddl:columnReference) = ddl:columnReference multiple
+[derbyddl:declareGlobalTemporaryTableStatement] > ddl:createTableStatement mixin
+
+// =============================================================================
+// DROP STATEMENTS
+// =============================================================================
+[derbyddl:dropFunctionStatement] > ddl:droppable, derbyddl:functionOperand mixin
+[derbyddl:dropIndexStatement] > ddl:droppable, derbyddl:indexOperand mixin
+[derbyddl:dropProcedureStatement] > ddl:droppable, derbyddl:procedureOperand mixin
+[derbyddl:dropRoleStatement] > ddl:droppable, derbyddl:roleOperand mixin
+[derbyddl:dropSynonymStatement] > ddl:droppable, derbyddl:synonymOperand mixin
+[derbyddl:dropTriggerStatement] > ddl:droppable, derbyddl:triggerOperand mixin
+
+// =============================================================================
+// MISC STATEMENTS
+// =============================================================================
+[derbyddl:lockTableStatement] > ddl:statement, ddl:tableOperand mixin
+[derbyddl:renameTableStatement] > ddl:statement, ddl:renamable, ddl:tableOperand mixin
+
+[derbyddl:grantOnFunctionStatement] > ddl:grantStatement, derbyddl:functionOperand mixin
+[derbyddl:grantOnProcedureStatement] > ddl:grantStatement, derbyddl:procedureOperand mixin
+
+[derbyddl:grantRolesStatement] > ddl:grantStatement mixin
+ + ddl:name (derbyddl:roleName) = derbyddl:roleName multiple
\ No newline at end of file
Property changes on: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/DerbyDdl.cnd
___________________________________________________________________
Name: svn:executable
+ *
Copied: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/derby_test_statements.ddl (from rev 1494, trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/derby_test_statements.ddl)
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/derby_test_statements.ddl (rev 0)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/derby_test_statements.ddl 2010-01-05 20:33:13 UTC (rev 1530)
@@ -0,0 +1,218 @@
+--
+-- SAMPLE DERBY STATEMENTS
+--
+
+-- Add a new column with a column-level constraint
+-- to an existing table
+-- An exception will be thrown if the table
+-- contains any rows
+-- since the newcol will be initialized to NULL
+-- in all existing rows in the table
+ALTER TABLE CITIES ADD COLUMN REGION VARCHAR(26)
+ CONSTRAINT NEW_CONSTRAINT CHECK (REGION IS NOT NULL);
+
+-- Add a new unique constraint to an existing table
+-- An exception will be thrown if duplicate keys are found
+ALTER TABLE SAMP.DEPARTMENT
+ ADD CONSTRAINT NEW_UNIQUE UNIQUE (DEPTNO);
+
+-- add a new foreign key constraint to the
+-- Cities table. Each row in Cities is checked
+-- to make sure it satisfied the constraints.
+-- if any rows don't satisfy the constraint, the
+-- constraint is not added
+ALTER TABLE CITIES ADD CONSTRAINT COUNTRY_FK
+ Foreign Key (COUNTRY) REFERENCES COUNTRIES (COUNTRY);
+
+-- Add a primary key constraint to a table
+-- First, create a new table
+CREATE TABLE ACTIVITIES (CITY_ID INT NOT NULL,
+ SEASON CHAR(2), ACTIVITY VARCHAR(32) NOT NULL);
+-- You will not be able to add this constraint if the
+-- columns you are including in the primary key have
+-- null data or duplicate values.
+ALTER TABLE Activities ADD PRIMARY KEY (city_id, activity);
+
+-- Drop the city_id column if there are no dependent objects:
+ALTER TABLE Cities DROP COLUMN city_id RESTRICT;
+-- Drop the city_id column, also dropping all dependent objects:
+ALTER TABLE Cities DROP COLUMN city_id CASCADE;
+
+-- Drop a primary key constraint from the CITIES table
+
+ALTER TABLE Cities DROP CONSTRAINT Cities_PK;
+-- Drop a foreign key constraint from the CITIES table
+ALTER TABLE Cities DROP CONSTRAINT COUNTRIES_FK;
+-- add a DEPTNO column with a default value of 1
+ALTER TABLE SAMP.EMP_ACT ADD COLUMN DEPTNO INT DEFAULT 1;
+-- increase the width of a VARCHAR column
+ALTER TABLE SAMP.EMP_PHOTO ALTER PHOTO_FORMAT SET DATA TYPE VARCHAR(30);
+-- change the lock granularity of a table
+ALTER TABLE SAMP.SALES LOCKSIZE TABLE;
+
+-- Remove the NOT NULL constraint from the MANAGER column
+ALTER TABLE Employees ALTER COLUMN Manager NULL;
+-- Add the NOT NULL constraint to the SSN column
+ALTER TABLE Employees ALTER COLUMN ssn NOT NULL;
+
+-- Change the default value for the SALARY column
+ALTER TABLE Employees ALTER COLUMN Salary DEFAULT 1000.0;
+ALTER TABLE Employees ALTER COLUMN Salary DROP DEFAULT;
+
+
+CREATE FUNCTION TO_DEGREES
+ ( RADIANS DOUBLE )
+ RETURNS DOUBLE
+ PARAMETER STYLE JAVA
+ NO SQL LANGUAGE JAVA
+ EXTERNAL NAME 'java.lang.Math.toDegrees';
+
+
+CREATE FUNCTION PROPERTY_FILE_READER
+ ( FILENAME VARCHAR( 32672 ) )
+ RETURNS TABLE
+ (
+ KEY_COL VARCHAR( 10 ),
+ VALUE_COL VARCHAR( 1000 )
+ )
+ LANGUAGE JAVA
+ PARAMETER STYLE DERBY_JDBC_RESULT_SET
+ NO SQL
+ EXTERNAL NAME 'vtis.example.PropertyFileVTI.propertyFileVTI';
+
+CREATE INDEX OrigIndex ON Flights(orig_airport);
+
+CREATE INDEX PAY_DESC ON SAMP.EMPLOYEE (SALARY);
+
+CREATE INDEX IXSALE ON SAMP.SALES (SALES);
+
+CREATE PROCEDURE SALES.TOTAL_REVENUE(IN S_MONTH INTEGER,
+ IN S_YEAR INTEGER, OUT TOTAL DECIMAL(10,2))
+ PARAMETER STYLE JAVA READS SQL DATA LANGUAGE JAVA EXTERNAL NAME
+ 'com.acme.sales.calculateRevenueByMonth';
+
+CREATE ROLE purchases_reader;
+
+CREATE ROLE purchases_reader_role;
+
+CREATE SCHEMA FLIGHTS AUTHORIZATION anita;
+
+CREATE SCHEMA EMP;
+
+CREATE SCHEMA AUTHORIZATION takumi;
+
+CREATE SYNONYM SAMP.T1 FOR SAMP.TABLEWITHLONGNAME;
+
+CREATE TABLE HOTELAVAILABILITY
+ (HOTEL_ID INT NOT NULL, BOOKING_DATE DATE NOT NULL,
+ ROOMS_TAKEN INT DEFAULT 0, PRIMARY KEY (HOTEL_ID, BOOKING_DATE));
+
+CREATE TABLE PEOPLE
+ (PERSON_ID INT NOT NULL GENERATED ALWAYS AS IDENTITY
+ CONSTRAINT PEOPLE_PK PRIMARY KEY, PERSON VARCHAR(26));
+
+CREATE TABLE greetings
+ (i int generated by default as identity (START WITH 2, INCREMENT BY 1), ch char(50));
+
+CREATE TABLE GROUPS
+ (GROUP_ID SMALLINT NOT NULL GENERATED ALWAYS AS IDENTITY
+ (START WITH 5, INCREMENT BY 5), ADDRESS VARCHAR(100), PHONE VARCHAR(15));
+
+CREATE TRIGGER t1 NO CASCADE BEFORE UPDATE ON x
+ FOR EACH ROW MODE DB2SQL
+ values app.notifyEmail('Jerry', 'Table x is about to be updated');
+
+
+CREATE TRIGGER FLIGHTSDELETE
+ AFTER DELETE ON FLIGHTS
+ REFERENCING OLD_TABLE AS DELETEDFLIGHTS
+ FOR EACH STATEMENT
+ DELETE FROM FLIGHTAVAILABILITY WHERE FLIGHT_ID IN
+ (SELECT FLIGHT_ID FROM DELETEDFLIGHTS);
+
+CREATE TRIGGER FLIGHTSDELETE3
+ AFTER DELETE ON FLIGHTS
+ REFERENCING OLD AS OLD
+ FOR EACH ROW
+ DELETE FROM FLIGHTAVAILABILITY WHERE FLIGHT_ID = OLD.FLIGHT_ID;
+
+
+
+CREATE VIEW SAMP.V1 (COL_SUM, COL_DIFF)
+ AS SELECT COMM + BONUS, COMM - BONUS
+ FROM SAMP.EMPLOYEE;
+
+CREATE VIEW SAMP.VEMP_RES (RESUME)
+ AS VALUES 'Delores M. Quintana', 'Heather A. Nicholls', 'Bruce Adamson';
+
+CREATE VIEW SAMP.PROJ_COMBO
+ (PROJNO, PRENDATE, PRSTAFF, MAJPROJ)
+ AS SELECT PROJNO, PRENDATE, PRSTAFF, MAJPROJ
+ FROM SAMP.PROJECT UNION ALL
+ SELECT PROJNO, EMSTDATE, EMPTIME, EMPNO
+ FROM SAMP.EMP_ACT
+ WHERE EMPNO IS NOT NULL;
+
+CREATE VIEW V1 (C1) AS SELECT SIN(C1) FROM T1;
+
+declare global temporary table SESSION.t1(c11 int) not logged;
+-- The SESSION qualification is redundant here because temporary
+-- tables can only exist in the SESSION schema.
+
+declare global temporary table t2(c21 int) not logged;
+-- The temporary table is not qualified here with SESSION because temporary
+-- tables can only exist in the SESSION schema.
+
+DROP FUNCTION some_function_name;
+
+DROP INDEX OrigIndex;
+
+DROP INDEX DestIndex;
+
+DROP PROCEDURE some_procedure_name;
+
+DROP ROLE reader;
+
+-- The RESTRICT keyword is required
+DROP SCHEMA SAMP RESTRICT;
+
+DROP SYNONYM some_synonym_name;
+
+DROP TABLE some_table_name;
+
+DROP TRIGGER TRIG1;
+
+DROP VIEW AnIdentifier;
+
+GRANT SELECT ON TABLE t TO maria,harry;
+
+GRANT UPDATE, TRIGGER ON TABLE t TO anita,zhi;
+
+GRANT SELECT ON TABLE s.v to PUBLIC;
+
+GRANT EXECUTE ON PROCEDURE p TO george;
+
+GRANT purchases_reader_role TO george,maria;
+
+GRANT SELECT ON TABLE t TO purchases_reader_role;
+
+INSERT INTO COUNTRIES
+ VALUES ('Taiwan', 'TW', 'Asia');
+
+INSERT INTO MA_EMP_ACT
+ SELECT * FROM EMP_ACT
+ WHERE SUBSTR(PROJNO, 1, 2) = 'MA';
+
+-- Insert the DEFAULT value for the LOCATION column
+INSERT INTO DEPARTMENT
+ VALUES ('E31', 'ARCHITECTURE', '00390', 'E01', DEFAULT);
+
+LOCK TABLE FlightAvailability IN EXCLUSIVE MODE;
+
+LOCK TABLE Maps IN EXCLUSIVE MODE;
+
+RENAME INDEX DESTINDEX TO ARRIVALINDEX;
+
+RENAME TABLE SAMP.EMP_ACT TO EMPLOYEE_ACT;
+
+
Property changes on: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/derby/derby_test_statements.ddl
___________________________________________________________________
Name: svn:executable
+ *
Copied: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/OracleDdl.cnd (from rev 1494, trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/OracleDdl.cnd)
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/OracleDdl.cnd (rev 0)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/OracleDdl.cnd 2010-01-05 20:33:13 UTC (rev 1530)
@@ -0,0 +1,202 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+
+//------------------------------------------------------------------------------
+// N A M E S P A C E S
+//------------------------------------------------------------------------------
+<jcr='http://www.jcp.org/jcr/1.0'>
+<nt='http://www.jcp.org/jcr/nt/1.0'>
+<mix='http://www.jcp.org/jcr/mix/1.0'>
+<ddl='http://www.jboss.org/dna/ddl/1.0'>
+<oracleddl='http://www.jboss.org/dna/ddl/oracle/1.0'>
+
+// =============================================================================
+// OPERANDS
+// =============================================================================
+[oracleddl:clusterOperand] > ddl:operand abstract
+[oracleddl:commentOperand] > ddl:operand abstract
+[oracleddl:contextOperand] > ddl:operand abstract
+[oracleddl:controlfileOperand] > ddl:operand abstract
+[oracleddl:databaseOperand] > ddl:operand abstract
+[oracleddl:dimensionOperand] > ddl:operand abstract
+[oracleddl:directoryOperand] > ddl:operand abstract
+[oracleddl:diskgroupOperand] > ddl:operand abstract
+[oracleddl:functionOperand] > ddl:operand abstract
+[oracleddl:indexOperand] > ddl:operand abstract
+[oracleddl:indextypeOperand] > ddl:operand abstract
+[oracleddl:javaOperand] > ddl:operand abstract
+[oracleddl:libraryOperand] > ddl:operand abstract
+[oracleddl:materializedOperand] > ddl:operand abstract
+[oracleddl:operatorOperand] > ddl:operand abstract
+[oracleddl:outlineOperand] > ddl:operand abstract
+[oracleddl:packageOperand] > ddl:operand abstract
+[oracleddl:pfileOperand] > ddl:operand abstract
+[oracleddl:procedureOperand] > ddl:operand abstract
+[oracleddl:profileOperand] > ddl:operand abstract
+[oracleddl:resourceOperand] > ddl:operand abstract
+[oracleddl:roleOperand] > ddl:operand abstract
+[oracleddl:rollbackOperand] > ddl:operand abstract
+[oracleddl:sequenceOperand] > ddl:operand abstract
+[oracleddl:sessionOperand] > ddl:operand abstract
+[oracleddl:spfileOperand] > ddl:operand abstract
+[oracleddl:systemOperand] > ddl:operand abstract
+[oracleddl:synonymOperand] > ddl:operand abstract
+[oracleddl:tablespaceOperand] > ddl:operand abstract
+[oracleddl:triggerOperand] > ddl:operand abstract
+[oracleddl:typeOperand] > ddl:operand abstract
+[oracleddl:userOperand] > ddl:operand abstract
+
+// =============================================================================
+// COLUMN
+// =============================================================================
+[oracleddl:columnDefinition] > ddl:columnDefinition mixin
+ - oracleddl:dropDefault (boolean)
+
+// =============================================================================
+// ALTER STATEMENTS
+// =============================================================================
+[oracleddl:alterClusterStatement] > ddl:alterable, ddl:statement, oracleddl:clusterOperand mixin
+[oracleddl:alterDatabaseStatement] > ddl:alterable, ddl:statement, oracleddl:databaseOperand mixin
+[oracleddl:alterDimensionStatement] > ddl:alterable, ddl:statement, oracleddl:dimensionOperand mixin
+[oracleddl:alterDiskgroupStatement] > ddl:alterable, ddl:statement, oracleddl:diskgroupOperand mixin
+[oracleddl:alterFunctionStatement] > ddl:alterable, ddl:statement, oracleddl:functionOperand mixin
+[oracleddl:alterIndexStatement] > ddl:alterable, ddl:statement, oracleddl:indexOperand mixin
+[oracleddl:alterIndextypeStatement] > ddl:alterable, ddl:statement, oracleddl:indextypeOperand mixin
+[oracleddl:alterJavaStatement] > ddl:alterable, ddl:statement, oracleddl:javaOperand mixin
+[oracleddl:alterMaterializedStatement] > ddl:alterable, ddl:statement, oracleddl:materializedOperand mixin
+[oracleddl:alterOperatorStatement] > ddl:alterable, ddl:statement, oracleddl:operatorOperand mixin
+[oracleddl:alterOutlineStatement] > ddl:alterable, ddl:statement, oracleddl:outlineOperand mixin
+[oracleddl:alterPackageStatement] > ddl:alterable, ddl:statement, oracleddl:packageOperand mixin
+[oracleddl:alterProcedureStatement] > ddl:alterable, ddl:statement, oracleddl:procedureOperand mixin
+[oracleddl:alterProfileStatement] > ddl:alterable, ddl:statement, oracleddl:profileOperand mixin
+[oracleddl:alterResourceStatement] > ddl:alterable, ddl:statement, oracleddl:resourceOperand mixin
+[oracleddl:alterRoleStatement] > ddl:alterable, ddl:statement, oracleddl:roleOperand mixin
+[oracleddl:alterRollbackStatement] > ddl:alterable, ddl:statement, oracleddl:rollbackOperand mixin
+[oracleddl:alterSequenceStatement] > ddl:alterable, ddl:statement, oracleddl:sequenceOperand mixin
+[oracleddl:alterSessionStatement] > ddl:alterable, ddl:statement, oracleddl:sessionOperand mixin
+[oracleddl:alterSystemStatement] > ddl:alterable, ddl:statement, oracleddl:systemOperand mixin
+[oracleddl:alterTablespaceStatement] > ddl:alterable, ddl:statement, oracleddl:tablespaceOperand mixin
+[oracleddl:alterTriggerStatement] > ddl:alterable, ddl:statement, oracleddl:triggerOperand mixin
+[oracleddl:alterTypeStatement] > ddl:alterable, ddl:statement, oracleddl:typeOperand mixin
+[oracleddl:alterUserStatement] > ddl:alterable, ddl:statement, oracleddl:userOperand mixin
+[oracleddl:alterViewStatement] > ddl:alterable, ddl:statement, ddl:viewOperand mixin
+
+[oracleddl:alterTableStatement] > ddl:alterTableStatement mixin
+ - oracleddl:newTableName (STRING)
+ + oracleddl:renameColumn (ddl:renamable) = ddl:renamable multiple
+ + oracleddl:renameConstraint (ddl:renamable) = ddl:renamable multiple
+
+// =============================================================================
+// CREATE STATEMENTS
+// =============================================================================
+
+[oracleddl:createClusterStatement] > ddl:creatable, ddl:statement, oracleddl:clusterOperand mixin
+[oracleddl:createContextStatement] > ddl:creatable, ddl:statement, oracleddl:contextOperand mixin
+[oracleddl:createControlfileStatement] > ddl:creatable, ddl:statement, oracleddl:controlfileOperand mixin
+[oracleddl:createDatabaseStatement] > ddl:creatable, ddl:statement, oracleddl:databaseOperand mixin
+[oracleddl:createDimensionStatement] > ddl:creatable, ddl:statement, oracleddl:dimensionOperand mixin
+[oracleddl:createDirectoryStatement] > ddl:creatable, ddl:statement, oracleddl:directoryOperand mixin
+[oracleddl:createDiskgroupStatement] > ddl:creatable, ddl:statement, oracleddl:diskgroupOperand mixin
+[oracleddl:createFunctionStatement] > ddl:creatable, ddl:statement, oracleddl:functionOperand mixin
+[oracleddl:createIndexStatement] > ddl:creatable, ddl:statement, oracleddl:indexOperand mixin
+ - oracleddl:tableName (string) mandatory
+ - oracleddl:unique (boolean)
+ - oracleddl:bitmap (boolean)
+ + * (ddl:columnReference) = ddl:columnReference multiple
+[oracleddl:createIndexTypeStatement] > ddl:creatable, ddl:statement, oracleddl:indextypeOperand mixin
+[oracleddl:createJavaStatement] > ddl:creatable, ddl:statement, oracleddl:javaOperand mixin
+[oracleddl:createLibraryStatement] > ddl:creatable, ddl:statement, oracleddl:libraryOperand mixin
+[oracleddl:createMaterializedStatement] > ddl:creatable, ddl:statement, oracleddl:materializedOperand mixin
+[oracleddl:createOperatorStatement] > ddl:creatable, ddl:statement, oracleddl:operatorOperand mixin
+[oracleddl:createOutlineStatement] > ddl:creatable, ddl:statement, oracleddl:outlineOperand mixin
+[oracleddl:createPackageStatement] > ddl:creatable, ddl:statement, oracleddl:packageOperand mixin
+[oracleddl:createPfileStatement] > ddl:creatable, ddl:statement, oracleddl:pfileOperand mixin
+[oracleddl:createProcedureStatement] > ddl:creatable, ddl:statement, oracleddl:procedureOperand mixin
+[oracleddl:createRoleStatement] > ddl:creatable, ddl:statement, oracleddl:roleOperand mixin
+[oracleddl:createRollbackStatement] > ddl:creatable, ddl:statement, oracleddl:rollbackOperand mixin
+[oracleddl:createSequenceStatement] > ddl:creatable, ddl:statement, oracleddl:sequenceOperand mixin
+[oracleddl:createSpfileStatement] > ddl:creatable, ddl:statement, oracleddl:spfileOperand mixin
+[oracleddl:createSynonymStatement] > ddl:creatable, ddl:statement, oracleddl:synonymOperand mixin
+[oracleddl:createTablespaceStatement] > ddl:creatable, ddl:statement, oracleddl:tablespaceOperand mixin
+[oracleddl:createTriggerStatement] > ddl:creatable, ddl:statement, oracleddl:triggerOperand mixin
+[oracleddl:createTypeStatement] > ddl:creatable, ddl:statement, oracleddl:typeOperand mixin
+[oracleddl:createUserStatement] > ddl:creatable, ddl:statement, oracleddl:userOperand mixin
+
+// =============================================================================
+// DROP STATEMENTS
+// =============================================================================
+
+[oracleddl:dropClusterStatement] > ddl:droppable, ddl:statement, oracleddl:clusterOperand mixin
+[oracleddl:dropContextStatement] > ddl:droppable, ddl:statement, oracleddl:contextOperand mixin
+[oracleddl:dropDatabaseStatement] > ddl:droppable, ddl:statement, oracleddl:databaseOperand mixin
+[oracleddl:dropDimensionStatement] > ddl:droppable, ddl:statement, oracleddl:dimensionOperand mixin
+[oracleddl:dropDirectoryStatement] > ddl:droppable, ddl:statement, oracleddl:directoryOperand mixin
+[oracleddl:dropDiskgroupStatement] > ddl:droppable, ddl:statement, oracleddl:diskgroupOperand mixin
+[oracleddl:dropFunctionStatement] > ddl:droppable, ddl:statement, oracleddl:functionOperand mixin
+[oracleddl:dropIndexStatement] > ddl:droppable, ddl:statement, oracleddl:indexOperand mixin
+[oracleddl:dropIndextypeStatement] > ddl:droppable, ddl:statement, oracleddl:indextypeOperand mixin
+[oracleddl:dropJavaStatement] > ddl:droppable, ddl:statement, oracleddl:javaOperand mixin
+[oracleddl:dropLibraryStatement] > ddl:droppable, ddl:statement, oracleddl:libraryOperand mixin
+[oracleddl:dropMaterializedStatement] > ddl:droppable, ddl:statement, oracleddl:materializedOperand mixin
+[oracleddl:dropOperatorStatement] > ddl:droppable, ddl:statement, oracleddl:operatorOperand mixin
+[oracleddl:dropOutlineStatement] > ddl:droppable, ddl:statement, oracleddl:outlineOperand mixin
+[oracleddl:dropPackageStatement] > ddl:droppable, ddl:statement, oracleddl:packageOperand mixin
+[oracleddl:dropProcedureStatement] > ddl:droppable, ddl:statement, oracleddl:procedureOperand mixin
+[oracleddl:dropProfileStatement] > ddl:droppable, ddl:statement, oracleddl:profileOperand mixin
+[oracleddl:dropRoleStatement] > ddl:droppable, ddl:statement, oracleddl:roleOperand mixin
+[oracleddl:dropRollbackStatement] > ddl:droppable, ddl:statement, oracleddl:rollbackOperand mixin
+[oracleddl:dropSequenceStatement] > ddl:droppable, ddl:statement, oracleddl:sequenceOperand mixin
+[oracleddl:dropSynonymStatement] > ddl:droppable, ddl:statement, oracleddl:synonymOperand mixin
+[oracleddl:dropTablespaceStatement] > ddl:droppable, ddl:statement, oracleddl:tablespaceOperand mixin
+[oracleddl:dropTriggerStatement] > ddl:droppable, ddl:statement, oracleddl:triggerOperand mixin
+[oracleddl:dropTypeStatement] > ddl:droppable, ddl:statement, oracleddl:typeOperand mixin
+[oracleddl:dropUserStatement] > ddl:droppable, ddl:statement, oracleddl:userOperand mixin
+
+// =============================================================================
+// MISC STATEMENTS
+// =============================================================================
+
+[oracleddl:analyzeStatement] > ddl:statement mixin
+[oracleddl:associateStatisticsStatement] > ddl:statement mixin
+[oracleddl:auditStatement] > ddl:statement mixin
+[oracleddl:commitStatement] > ddl:statement mixin
+[oracleddl:commentOnStatement] > ddl:statement, oracleddl:commentOperand mixin
+ - oracleddl:targetObjectType (STRING) mandatory
+ - oracleddl:targetObjectName (STRING)
+ - oracleddl:comment (STRING) mandatory
+[oracleddl:disassociateStatisticsStatement] > ddl:statement mixin
+[oracleddl:explainPlanStatement] > ddl:statement mixin
+[oracleddl:flashbackStatement] > ddl:statement mixin
+[oracleddl:lockTableStatement] > ddl:statement mixin
+[oracleddl:mergeStatement] > ddl:statement mixin
+[oracleddl:nestedTableStatement] > ddl:statement mixin
+[oracleddl:noauditStatement] > ddl:statement mixin
+[oracleddl:purgeStatement] > ddl:statement mixin
+[oracleddl:renameStatement] > ddl:statement mixin
+[oracleddl:revokeStatement] > ddl:statement mixin
+[oracleddl:rollbackStatement] > ddl:statement mixin
+[oracleddl:setConstraintsStatement] > ddl:statement, ddl:settable mixin
+[oracleddl:setRoleStatement] > ddl:statement, ddl:settable mixin
+[oracleddl:setTransactionStatement] > ddl:statement, ddl:settable mixin
+[oracleddl:truncateStatement] > ddl:statement mixin
Property changes on: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/OracleDdl.cnd
___________________________________________________________________
Name: svn:executable
+ *
Copied: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/oracle_test_statements.ddl (from rev 1494, trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/oracle_test_statements.ddl)
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/oracle_test_statements.ddl (rev 0)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/oracle_test_statements.ddl 2010-01-05 20:33:13 UTC (rev 1530)
@@ -0,0 +1,234 @@
+--
+-- SAMPLE ORACLE STATEMENTS
+--
+
+ALTER TABLE employees
+ PCTFREE 30
+ PCTUSED 60;
+
+ALTER TABLE countries
+ ADD (duty_pct NUMBER(2,2) CHECK (duty_pct < 10.5),
+ visa_needed VARCHAR2(3));
+
+ALTER TABLESPACE tbs_01
+ BEGIN BACKUP;
+
+ALTER TABLESPACE omf_ts1 ADD DATAFILE;
+
+ALTER TABLESPACE undots1
+ RETENTION NOGUARANTEE;
+
+ALTER TRIGGER update_job_history DISABLE;
+
+ALTER TYPE data_typ
+ ADD MEMBER FUNCTION qtr(der_qtr DATE)
+ RETURN CHAR CASCADE;
+
+ALTER TYPE cust_address_typ
+ ADD ATTRIBUTE (phone phone_list_typ) CASCADE;
+
+ALTER TYPE phone_list_typ
+ MODIFY ELEMENT TYPE VARCHAR(64) CASCADE;
+
+ALTER USER app_user1
+ GRANT CONNECT THROUGH sh
+ WITH ROLE warehouse_user;
+
+-- 10 Statements
+
+ALTER USER app_user1 IDENTIFIED GLOBALLY AS 'CN=tom,O=oracle,C=US';
+
+ALTER USER sidney
+ IDENTIFIED BY second_2nd_pwd
+ DEFAULT TABLESPACE example;
+
+ALTER VIEW customer_ro
+ COMPILE;
+
+ANALYZE TABLE customers VALIDATE STRUCTURE ONLINE;
+
+ANALYZE TABLE employees VALIDATE STRUCTURE CASCADE;
+
+ANALYZE TABLE orders DELETE STATISTICS;
+
+ASSOCIATE STATISTICS WITH PACKAGES emp_mgmt DEFAULT SELECTIVITY 10;
+
+AUDIT SELECT
+ ON hr.employees
+ WHENEVER SUCCESSFUL;
+
+AUDIT INSERT, UPDATE
+ ON oe.customers;
+
+AUDIT DELETE ANY TABLE;
+
+-- 20 Statements
+
+AUDIT ROLE
+ WHENEVER SUCCESSFUL;
+
+COMMENT ON COLUMN employees.job_id
+ IS 'abbreviated job title';
+
+COMMIT WORK;
+
+COMMIT COMMENT 'In-doubt transaction Code 36, Call (415) 555-2637';
+
+CREATE CLUSTER personnel
+ (department NUMBER(4))
+SIZE 512
+STORAGE (initial 100K next 50K);
+
+CREATE CLUSTER address
+ (postal_code NUMBER, country_id CHAR(2))
+ HASHKEYS 20
+ HASH IS MOD(postal_code + country_id, 101);
+
+CREATE CLUSTER cust_orders (customer_id NUMBER(6))
+ SIZE 512 SINGLE TABLE HASHKEYS 100;
+
+CREATE CONTEXT hr_context USING emp_mgmt;
+
+CREATE CONTROLFILE REUSE DATABASE "demo" NORESETLOGS NOARCHIVELOG
+ MAXLOGFILES 32
+ MAXLOGMEMBERS 2
+ MAXDATAFILES 32
+ MAXINSTANCES 1
+ MAXLOGHISTORY 449
+ LOGFILE
+ GROUP 1 '/path/oracle/dbs/t_log1.f' SIZE 500K,
+ GROUP 2 '/path/oracle/dbs/t_log2.f' SIZE 500K
+ # STANDBY LOGFILE
+ DATAFILE
+ '/path/oracle/dbs/t_db1.f',
+ '/path/oracle/dbs/dbu19i.dbf',
+ '/path/oracle/dbs/tbs_11.f',
+ '/path/oracle/dbs/smundo.dbf',
+ '/path/oracle/dbs/demo.dbf'
+ CHARACTER SET WE8DEC
+ ;
+
+CREATE DATABASE sample
+ CONTROLFILE REUSE
+ LOGFILE
+ GROUP 1 ('diskx:log1.log', 'disky:log1.log') SIZE 50K,
+ GROUP 2 ('diskx:log2.log', 'disky:log2.log') SIZE 50K
+ MAXLOGFILES 5
+ MAXLOGHISTORY 100
+ MAXDATAFILES 10
+ MAXINSTANCES 2
+ ARCHIVELOG
+ CHARACTER SET AL32UTF8
+ NATIONAL CHARACTER SET AL16UTF16
+ DATAFILE
+ 'disk1:df1.dbf' AUTOEXTEND ON,
+ 'disk2:df2.dbf' AUTOEXTEND ON NEXT 10M MAXSIZE UNLIMITED
+ DEFAULT TEMPORARY TABLESPACE temp_ts
+ UNDO TABLESPACE undo_ts
+ SET TIME_ZONE = '+02:00';
+
+-- 30 Statements
+
+CREATE PUBLIC DATABASE LINK remote
+ USING 'remote';
+
+CREATE DATABASE LINK local
+ CONNECT TO hr IDENTIFIED BY hr
+ USING 'local';
+
+CREATE DIMENSION customers_dim
+ LEVEL customer IS (customers.cust_id)
+ LEVEL city IS (customers.cust_city)
+ LEVEL state IS (customers.cust_state_province)
+ LEVEL country IS (countries.country_id)
+ LEVEL subregion IS (countries.country_subregion)
+ LEVEL region IS (countries.country_region)
+ HIERARCHY geog_rollup (
+ customer CHILD OF
+ city CHILD OF
+ state CHILD OF
+ country CHILD OF
+ subregion CHILD OF
+ region
+ JOIN KEY (customers.country_id) REFERENCES country
+ )
+ ATTRIBUTE customer DETERMINES
+ (cust_first_name, cust_last_name, cust_gender,
+ cust_marital_status, cust_year_of_birth,
+ cust_income_level, cust_credit_limit)
+ ATTRIBUTE country DETERMINES (countries.country_name)
+;
+
+CREATE DIRECTORY admin AS 'oracle/admin';
+
+CREATE OR REPLACE DIRECTORY bfile_dir AS '/private1/LOB/files';
+
+CREATE DISKGROUP dgroup_01
+ EXTERNAL REDUNDANCY
+ DISK '$ORACLE_HOME/disks/c*';
+
+CREATE FUNCTION SecondMax (input NUMBER) RETURN NUMBER
+ PARALLEL_ENABLE AGGREGATE USING SecondMaxImpl;
+
+CREATE OR REPLACE FUNCTION text_length(a CLOB)
+ RETURN NUMBER DETERMINISTIC IS
+ BEGIN
+ RETURN DBMS_LOB.GETLENGTH(a);
+ END;
+/
+
+CREATE INDEXTYPE position_indextype
+ FOR position_between(NUMBER, NUMBER, NUMBER)
+ USING position_im;
+
+CREATE JAVA SOURCE NAMED "Hello" AS
+ public class Hello {
+ public static String hello() {
+ return \"Hello World\"; } };
+
+-- 40 Statements
+
+CREATE JAVA RESOURCE NAMED "appText"
+ USING BFILE (bfile_dir, 'textBundle.dat');
+
+CREATE LIBRARY ext_lib AS '/OR/lib/ext_lib.so';
+/
+
+CREATE OR REPLACE LIBRARY ext_lib IS '/OR/newlib/ext_lib.so';
+/
+
+CREATE LIBRARY app_lib as '${ORACLE_HOME}/lib/app_lib.so'
+ AGENT 'sales.hq.acme.com';
+/
+
+CREATE MATERIALIZED VIEW LOG ON employees
+ WITH PRIMARY KEY
+ INCLUDING NEW VALUES;
+
+CREATE MATERIALIZED VIEW all_customers
+ PCTFREE 5 PCTUSED 60
+ TABLESPACE example
+ STORAGE (INITIAL 50K NEXT 50K)
+ USING INDEX STORAGE (INITIAL 25K NEXT 25K)
+ REFRESH START WITH ROUND(SYSDATE + 1) + 11/24
+ NEXT NEXT_DAY(TRUNC(SYSDATE), 'MONDAY') + 15/24
+ AS SELECT * FROM sh.customers@remote
+ UNION
+ SELECT * FROM sh.customers@local;
+
+CREATE MATERIALIZED VIEW LOG ON product_information
+ WITH ROWID, SEQUENCE (list_price, min_price, category_id)
+ INCLUDING NEW VALUES;
+
+CREATE OPERATOR eq_op
+ BINDING (VARCHAR2, VARCHAR2)
+ RETURN NUMBER
+ USING eq_f;
+
+CREATE OUTLINE salaries FOR CATEGORY special
+ ON SELECT last_name, salary FROM employees;
+
+CREATE OR REPLACE OUTLINE public_salaries
+ FROM PRIVATE my_salaries;
+
+-- 50 Statements so far
Property changes on: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/oracle/oracle_test_statements.ddl
___________________________________________________________________
Name: svn:executable
+ *
Copied: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/PostgresDdl.cnd (from rev 1528, trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/PostgresDdl.cnd)
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/PostgresDdl.cnd (rev 0)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/PostgresDdl.cnd 2010-01-05 20:33:13 UTC (rev 1530)
@@ -0,0 +1,205 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+
+//------------------------------------------------------------------------------
+// N A M E S P A C E S
+//------------------------------------------------------------------------------
+<jcr='http://www.jcp.org/jcr/1.0'>
+<nt='http://www.jcp.org/jcr/nt/1.0'>
+<mix='http://www.jcp.org/jcr/mix/1.0'>
+<ddl='http://www.jboss.org/dna/ddl/1.0'>
+<postgresddl='http://www.jboss.org/dna/ddl/postgres/1.0'>
+
+// =============================================================================
+// OPERANDS
+// =============================================================================
+[postgresddl:aggregateOperand] > ddl:operand abstract
+[postgresddl:castOperand] > ddl:operand abstract
+[postgresddl:commentOperand] > ddl:operand abstract
+[postgresddl:constraintTriggerOperand] > ddl:operand abstract
+[postgresddl:conversionOperand] > ddl:operand abstract
+[postgresddl:databaseOperand] > ddl:operand abstract
+[postgresddl:foreignDataOperand] > ddl:operand abstract
+[postgresddl:groupOperand] > ddl:operand abstract
+[postgresddl:functionOperand] > ddl:operand abstract
+[postgresddl:indexOperand] > ddl:operand abstract
+[postgresddl:languageOperand] > ddl:operand abstract
+[postgresddl:operatorOperand] > ddl:operand abstract
+[postgresddl:ownedByOperand] > ddl:operand abstract
+[postgresddl:roleOperand] > ddl:operand abstract
+[postgresddl:ruleOperand] > ddl:operand abstract
+[postgresddl:sequenceOperand] > ddl:operand abstract
+[postgresddl:serverOperand] > ddl:operand abstract
+[postgresddl:tablespaceOperand] > ddl:operand abstract
+[postgresddl:textSearchOperand] > ddl:operand abstract
+[postgresddl:triggerOperand] > ddl:operand abstract
+[postgresddl:typeOperand] > ddl:operand abstract
+[postgresddl:userOperand] > ddl:operand abstract
+[postgresddl:userMappingOperand] > ddl:operand abstract
+[postgresddl:parameterOperand] > ddl:operand abstract
+
+[postgresddl:functionParameter] > postgresddl:parameterOperand mixin
+ - ddl:datatypeName (STRING) mandatory
+ - ddl:datatypeLength (LONG)
+ - ddl:datatypePrecision (LONG)
+ - ddl:datatypeScale (LONG)
+ - ddl:nullable (STRING)
+ - ddl:defaultOption (STRING)
+ - postgresddl:mode (STRING)
+
+[postgresddl:role] > postgresddl:roleOperand mixin
+
+[postgresddl:renamedColumn] > ddl:renamable mixin
+
+// =============================================================================
+// ALTER STATEMENTS
+// =============================================================================
+[postgresddl:alterAggregateStatement] > ddl:alterable, ddl:statement, postgresddl:aggregateOperand mixin
+[postgresddl:alterConversionStatement] > ddl:alterable, ddl:statement, postgresddl:conversionOperand mixin
+[postgresddl:alterDatabaseStatement] > ddl:alterable, ddl:statement, postgresddl:databaseOperand mixin
+[postgresddl:alterForeignDataWrapperStatement] > ddl:alterable, ddl:statement, postgresddl:foreignDataOperand mixin
+[postgresddl:alterFunctionStatement] > ddl:alterable, ddl:statement, postgresddl:functionOperand mixin
+[postgresddl:alterGroupStatement] > ddl:alterable, ddl:statement, postgresddl:groupOperand mixin
+[postgresddl:alterIndexStatement] > ddl:alterable, ddl:statement, postgresddl:indexOperand mixin
+[postgresddl:alterLanguageStatement] > ddl:alterable, ddl:statement, postgresddl:languageOperand mixin
+[postgresddl:alterOperatorStatement] > ddl:alterable, ddl:statement, postgresddl:operatorOperand mixin
+[postgresddl:alterRoleStatement] > ddl:alterable, ddl:statement, postgresddl:roleOperand mixin
+[postgresddl:alterSchemaStatement] > ddl:alterable, ddl:statement, ddl:schemaOperand mixin
+[postgresddl:alterSequenceStatement] > ddl:alterable, ddl:statement, postgresddl:sequenceOperand mixin
+[postgresddl:alterServerStatement] > ddl:alterable, ddl:statement, postgresddl:serverOperand mixin
+[postgresddl:alterTablespaceStatement] > ddl:alterable, ddl:statement, postgresddl:tablespaceOperand mixin
+[postgresddl:alterTextSearchStatement] > ddl:alterable, ddl:statement, postgresddl:textSearchOperand mixin
+[postgresddl:alterTriggerStatement] > ddl:alterable, ddl:statement, postgresddl:triggerOperand mixin
+[postgresddl:alterTypeStatement] > ddl:alterable, ddl:statement, postgresddl:typeOperand mixin
+[postgresddl:alterUserStatement] > ddl:alterable, ddl:statement, postgresddl:userOperand mixin
+[postgresddl:alterUserMappingStatement] > ddl:alterable, ddl:statement, postgresddl:userMappingOperand mixin
+[postgresddl:alterViewStatement] > ddl:alterable, ddl:statement, ddl:viewOperand mixin
+
+[postgresddl:alterTableStatement] > ddl:alterTableStatement mixin
+ - postgresddl:newTableName (STRING)
+ - postgresddl:schemaName (STRING)
+ + postgresddl:renameColumn (postgresddl:renamedColumn) = postgresddl:renamedColumn multiple
+
+
+// =============================================================================
+// CREATE STATEMENTS
+// =============================================================================
+
+[postgresddl:createAggregateStatement] > ddl:creatable, ddl:statement, postgresddl:aggregateOperand mixin
+[postgresddl:createCastStatement] > ddl:creatable, ddl:statement, postgresddl:castOperand mixin
+[postgresddl:createConstraintTriggerStatement] > ddl:creatable, ddl:statement, postgresddl:constraintTriggerOperand mixin
+[postgresddl:createConversionStatement] > ddl:creatable, ddl:statement, postgresddl:conversionOperand mixin
+[postgresddl:createDatabaseStatement] > ddl:creatable, ddl:statement, postgresddl:databaseOperand mixin
+[postgresddl:createForeignDataWrapperStatement] > ddl:creatable, ddl:statement, postgresddl:foreignDataOperand mixin
+[postgresddl:createFunctionStatement] > ddl:creatable, ddl:statement, postgresddl:functionOperand mixin
+[postgresddl:createGroupStatement] > ddl:creatable, ddl:statement, postgresddl:groupOperand mixin
+[postgresddl:createIndexStatement] > ddl:creatable, ddl:statement, postgresddl:indexOperand mixin
+[postgresddl:createLanguageStatement] > ddl:creatable, ddl:statement, postgresddl:languageOperand mixin
+[postgresddl:createOperatorStatement] > ddl:creatable, ddl:statement, postgresddl:operatorOperand mixin
+[postgresddl:createRoleStatement] > ddl:creatable, ddl:statement, postgresddl:roleOperand mixin
+[postgresddl:createRuleStatement] > ddl:creatable, ddl:statement, postgresddl:ruleOperand mixin
+[postgresddl:createSequenceStatement] > ddl:creatable, ddl:statement, postgresddl:sequenceOperand mixin
+[postgresddl:createServerStatement] > ddl:creatable, ddl:statement, postgresddl:serverOperand mixin
+[postgresddl:createTablespaceStatement] > ddl:creatable, ddl:statement, postgresddl:tablespaceOperand mixin
+[postgresddl:createTextSearchStatement] > ddl:creatable, ddl:statement, postgresddl:textSearchOperand mixin
+[postgresddl:createTriggerStatement] > ddl:creatable, ddl:statement, postgresddl:triggerOperand mixin
+[postgresddl:createTypeStatement] > ddl:creatable, ddl:statement, postgresddl:typeOperand mixin
+[postgresddl:createUserStatement] > ddl:creatable, ddl:statement, postgresddl:userOperand mixin
+[postgresddl:createUserMappingStatement] > ddl:creatable, ddl:statement, postgresddl:userMappingOperand mixin
+
+// =============================================================================
+// DROP STATEMENTS
+// =============================================================================
+
+[postgresddl:dropAggregateStatement] > ddl:droppable, ddl:statement, postgresddl:aggregateOperand mixin
+[postgresddl:dropCastStatement] > ddl:droppable, ddl:statement, postgresddl:castOperand mixin
+[postgresddl:dropConstraintTriggerStatement] > ddl:droppable, ddl:statement, postgresddl:constraintTriggerOperand mixin
+[postgresddl:dropConversionStatement] > ddl:droppable, ddl:statement, postgresddl:conversionOperand mixin
+[postgresddl:dropDatabaseStatement] > ddl:droppable, ddl:statement, postgresddl:databaseOperand mixin
+[postgresddl:dropForeignDataWrapperStatement] > ddl:droppable, ddl:statement, postgresddl:foreignDataOperand mixin
+[postgresddl:dropFunctionStatement] > ddl:droppable, ddl:statement, postgresddl:functionOperand mixin
+[postgresddl:dropGroupStatement] > ddl:droppable, ddl:statement, postgresddl:groupOperand mixin
+[postgresddl:dropIndexStatement] > ddl:droppable, ddl:statement, postgresddl:indexOperand mixin
+[postgresddl:dropLanguageStatement] > ddl:droppable, ddl:statement, postgresddl:languageOperand mixin
+[postgresddl:dropOperatorStatement] > ddl:droppable, ddl:statement, postgresddl:operatorOperand mixin
+[postgresddl:dropOwnedByStatement] > ddl:droppable, ddl:statement, postgresddl:ownedByOperand mixin
+[postgresddl:dropRoleStatement] > ddl:droppable, ddl:statement, postgresddl:roleOperand mixin
+[postgresddl:dropRuleStatement] > ddl:droppable, ddl:statement, postgresddl:ruleOperand mixin
+[postgresddl:dropSequenceStatement] > ddl:droppable, ddl:statement, postgresddl:sequenceOperand mixin
+[postgresddl:dropServerStatement] > ddl:droppable, ddl:statement, postgresddl:serverOperand mixin
+[postgresddl:dropTablespaceStatement] > ddl:droppable, ddl:statement, postgresddl:tablespaceOperand mixin
+[postgresddl:dropTextSearchStatement] > ddl:droppable, ddl:statement, postgresddl:textSearchOperand mixin
+[postgresddl:dropTriggerStatement] > ddl:droppable, ddl:statement, postgresddl:triggerOperand mixin
+[postgresddl:dropTypeStatement] > ddl:droppable, ddl:statement, postgresddl:typeOperand mixin
+[postgresddl:dropUserStatement] > ddl:droppable, ddl:statement, postgresddl:userOperand mixin
+[postgresddl:dropUserMappingStatement] > ddl:droppable, ddl:statement, postgresddl:userMappingOperand mixin
+
+// =============================================================================
+// MISC STATEMENTS
+// =============================================================================
+
+[postgresddl:abortStatement] > ddl:statement mixin
+[postgresddl:analyzeStatement] > ddl:statement mixin
+[postgresddl:clusterStatement] > ddl:statement mixin
+[postgresddl:commentOnStatement] > ddl:statement, postgresddl:commentOperand mixin
+ - postgresddl:targetObjectType (STRING) mandatory
+ - postgresddl:targetObjectName (STRING)
+ - postgresddl:comment (STRING) mandatory
+[postgresddl:copyStatement] > ddl:statement mixin
+[postgresddl:deallocateStatement] > ddl:statement mixin
+[postgresddl:declareStatement] > ddl:statement mixin
+[postgresddl:discardStatement] > ddl:statement mixin
+[postgresddl:explainStatement] > ddl:statement mixin
+[postgresddl:fetchStatement] > ddl:statement mixin
+[postgresddl:listenStatement] > ddl:statement mixin
+[postgresddl:loadStatement] > ddl:statement mixin
+[postgresddl:lockTableStatement] > ddl:statement mixin
+[postgresddl:moveStatement] > ddl:statement mixin
+[postgresddl:notifyStatement] > ddl:statement mixin
+[postgresddl:prepareStatement] > ddl:statement mixin
+[postgresddl:reassignOwnedStatement] > ddl:statement mixin
+[postgresddl:reindexStatement] > ddl:statement mixin
+[postgresddl:releaseSavepointStatement] > ddl:statement mixin
+[postgresddl:rollbackStatement] > ddl:statement mixin
+[postgresddl:selectIntoStatement] > ddl:statement mixin
+[postgresddl:showStatement] > ddl:statement mixin
+[postgresddl:truncateStatement] > ddl:statement mixin
+[postgresddl:unlistenStatement] > ddl:statement mixin
+[postgresddl:vacuumStatement] > ddl:statement mixin
+
+// =============================================================================
+// GRANT STATEMENTS
+// =============================================================================
+[postgresddl:grantOnTableStatement] > ddl:grantStatement, ddl:tableOperand mixin
+[postgresddl:grantOnSequenceStatement] > ddl:grantStatement, postgresddl:sequenceOperand mixin
+[postgresddl:grantOnDatabaseStatement] > ddl:grantStatement, postgresddl:databaseOperand mixin
+[postgresddl:grantOnForeignDataWrapperStatement] > ddl:grantStatement, postgresddl:foreignDataOperand mixin
+[postgresddl:grantOnForeignServerStatement] > ddl:grantStatement, postgresddl:serverOperand mixin
+[postgresddl:grantOnFunctionStatement] > ddl:grantStatement, postgresddl:functionOperand mixin
+ + postgresddl:parameter (postgresddl:functionParameter) = postgresddl:functionParameter multiple
+[postgresddl:grantOnLanguageStatement] > ddl:grantStatement, postgresddl:languageOperand mixin
+[postgresddl:grantOnSchemaStatement] > ddl:grantStatement, ddl:schemaOperand mixin
+[postgresddl:grantOnTablespaceStatement] > ddl:grantStatement, postgresddl:tablespaceOperand mixin
+[postgresddl:grantRolesStatement] > ddl:grantStatement mixin
+ + postgresddl:grantRole (postgresddl:role) = postgresddl:role multiple
Property changes on: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/PostgresDdl.cnd
___________________________________________________________________
Name: svn:executable
+ *
Copied: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/postgres_test_statements.ddl (from rev 1528, trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/postgres_test_statements.ddl)
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/postgres_test_statements.ddl (rev 0)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/postgres_test_statements.ddl 2010-01-05 20:33:13 UTC (rev 1530)
@@ -0,0 +1,609 @@
+-- Postgres SQL Statements from postgresql-8.4.1-US.pdf
+--
+-- Extracted 10/5/2009
+
+--COMMENT ON
+--{
+-- TABLE object_name |
+-- COLUMN table_name.column_name |
+-- AGGREGATE agg_name (agg_type [, ...] ) |
+-- CAST (sourcetype AS targettype) |
+-- CONSTRAINT constraint_name ON table_name |
+-- CONVERSION object_name |
+-- DATABASE object_name |
+-- DOMAIN object_name |
+-- FUNCTION func_name ( [ [ argmode ] [ argname ] argtype [, ...] ] ) |
+-- INDEX object_name |
+-- LARGE OBJECT large_object_oid |
+-- OPERATOR op (leftoperand_type, rightoperand_type) |
+-- OPERATOR CLASS object_name USING index_method |
+-- OPERATOR FAMILY object_name USING index_method |
+-- [ PROCEDURAL ] LANGUAGE object_name |
+-- ROLE object_name |
+-- RULE rule_name ON table_name |
+-- SCHEMA object_name |
+-- SEQUENCE object_name |
+-- TABLESPACE object_name |
+-- TEXT SEARCH CONFIGURATION object_name |
+-- TEXT SEARCH DICTIONARY object_name |
+-- TEXT SEARCH PARSER object_name |
+-- TEXT SEARCH TEMPLATE object_name |
+-- TRIGGER trigger_name ON table_name |
+-- TYPE object_name |
+-- VIEW object_name
+--} IS 'text'
+
+COMMENT ON TABLE mytable IS 'This is my table.';
+COMMENT ON TABLE mytable IS NULL;
+COMMENT ON AGGREGATE my_aggregate (double precision) IS 'Computes sample variance';
+COMMENT ON CAST (text AS int4) IS 'Allow casts from text to int4';
+COMMENT ON COLUMN my_table.my_column IS 'Employee ID number';
+COMMENT ON CONVERSION my_conv IS 'Conversion to UTF8';
+COMMENT ON DATABASE my_database IS 'Development Database';
+COMMENT ON DOMAIN my_domain IS 'Email Address Domain';
+COMMENT ON FUNCTION my_function (timestamp) IS 'Returns Roman Numeral';
+COMMENT ON INDEX my_index IS 'Enforces uniqueness on employee ID';
+-- 10 STATEMENTS *******************************************************
+COMMENT ON LANGUAGE plpython IS 'Python support for stored procedures';
+COMMENT ON LARGE OBJECT 346344 IS 'Planning document';
+COMMENT ON OPERATOR ^ (text, text) IS 'Performs intersection of two texts';
+COMMENT ON OPERATOR - (NONE, text) IS 'This is a prefix operator on text';
+COMMENT ON OPERATOR CLASS int4ops USING btree IS '4 byte integer operators for btrees';
+COMMENT ON OPERATOR FAMILY integer_ops USING btree IS 'all integer operators for btrees';
+COMMENT ON ROLE my_role IS 'Administration group for finance tables';
+COMMENT ON RULE my_rule ON my_table IS 'Logs updates of employee records';
+COMMENT ON SCHEMA my_schema IS 'Departmental data';
+COMMENT ON SEQUENCE my_sequence IS 'Used to generate primary keys';
+-- 20 STATEMENTS *******************************************************
+COMMENT ON TABLE my_schema.my_table IS 'Employee Information';
+COMMENT ON TABLESPACE my_tablespace IS 'Tablespace for indexes';
+COMMENT ON TEXT SEARCH CONFIGURATION my_config IS 'Special word filtering';
+COMMENT ON TEXT SEARCH DICTIONARY swedish IS 'Snowball stemmer for swedish language';
+COMMENT ON TEXT SEARCH PARSER my_parser IS 'Splits text into words';
+COMMENT ON TEXT SEARCH TEMPLATE snowball IS 'Snowball stemmer';
+COMMENT ON TRIGGER my_trigger ON my_table IS 'Used for RI';
+COMMENT ON TYPE complex IS 'Complex number data type';
+COMMENT ON VIEW my_view IS 'View of departmental costs';
+--COMMIT [ WORK | TRANSACTION ]
+
+COMMIT WORK;
+-- 30 STATEMENTS *******************************************************
+COMMIT TRANSACTION;
+
+COMMIT;
+
+--COMMIT PREPARED transaction_id;
+
+COMMIT PREPARED 'foobar';
+
+--COPY tablename [ ( column [, ...] ) ]
+-- FROM { 'filename' | STDIN }
+-- [ [ WITH ]
+-- [ BINARY ]
+-- [ OIDS ]
+-- [ DELIMITER [ AS ] 'delimiter ' ]
+-- [ NULL [ AS ] 'null string ' ]
+-- [ CSV [ HEADER ]
+-- [ QUOTE [ AS ] 'quote' ]
+-- [ ESCAPE [ AS ] 'escape' ]
+-- [ FORCE NOT NULL column [, ...] ]
+--COPY { tablename [ ( column [, ...] ) ] | ( query ) }
+-- TO { 'filename' | STDOUT }
+-- [ [ WITH ]
+-- [ BINARY ]
+-- [ OIDS ]
+-- [ DELIMITER [ AS ] 'delimiter ' ]
+-- [ NULL [ AS ] 'null string ' ]
+-- [ CSV [ HEADER ]
+-- [ QUOTE [ AS ] 'quote' ]
+-- [ ESCAPE [ AS ] 'escape' ]
+-- [ FORCE QUOTE column [, ...] ]
+
+COPY country TO STDOUT WITH DELIMITER '|';
+
+COPY country FROM '/usr1/proj/bray/sql/country_data';
+
+COPY (SELECT * FROM country WHERE country_name LIKE 'A%') TO '/usr1/proj/bray/sql/a_list_co';
+
+--CREATE AGGREGATE name ( input_data_type [ , ... ] ) (
+-- SFUNC = sfunc,
+-- STYPE = state_data_type
+-- [ , FINALFUNC = ffunc ]
+-- [ , INITCOND = initial_condition ]
+-- [ , SORTOP = sort_operator ]
+--)
+--or the old syntax
+--CREATE AGGREGATE name (
+-- BASETYPE = base_type,
+-- SFUNC = sfunc,
+-- STYPE = state_data_type
+-- [ , FINALFUNC = ffunc ]
+-- [ , INITCOND = initial_condition ]
+-- [ , SORTOP = sort_operator ]
+--)
+
+
+--CREATE CAST (sourcetype AS targettype)
+-- WITH FUNCTION funcname (argtypes)
+-- [ AS ASSIGNMENT | AS IMPLICIT ]
+--CREATE CAST (sourcetype AS targettype)
+-- WITHOUT FUNCTION
+-- [ AS ASSIGNMENT | AS IMPLICIT ]
+--CREATE CAST (sourcetype AS targettype)
+-- WITH INOUT
+-- [ AS ASSIGNMENT | AS IMPLICIT ]
+
+CREATE CAST (bigint AS int4) WITH FUNCTION int4(bigint) AS ASSIGNMENT;
+
+--CREATE CONSTRAINT TRIGGER name
+-- AFTER event [ OR ... ]
+-- ON table_name
+-- [ FROM referenced_table_name ]
+-- { NOT DEFERRABLE | [ DEFERRABLE ] { INITIALLY IMMEDIATE | INITIALLY DEFERRED } }
+-- FOR EACH ROW
+-- EXECUTE PROCEDURE funcname ( arguments )
+
+--CREATE [ DEFAULT ] CONVERSION name
+-- FOR source_encoding TO dest_encoding FROM funcname
+
+CREATE CONVERSION myconv FOR 'UTF8' TO 'LATIN1' FROM myfunc;
+
+--CREATE DATABASE name
+-- [ [ WITH ] [ OWNER [=] dbowner ]
+-- [ TEMPLATE [=] template ]
+-- [ ENCODING [=] encoding ]
+-- [ LC_COLLATE [=] lc_collate ]
+-- [ LC_CTYPE [=] lc_ctype ]
+-- [ TABLESPACE [=] tablespace ]
+-- [ CONNECTION LIMIT [=] connlimit ] ]
+
+CREATE DATABASE lusiadas;
+
+CREATE DATABASE sales OWNER salesapp TABLESPACE salesspace;
+-- 40 STATEMENTS *******************************************************
+CREATE DATABASE music ENCODING 'LATIN1' TEMPLATE template0;
+
+
+--CREATE DOMAIN name [ AS ] data_type
+-- [ DEFAULT expression ]
+-- [ constraint [ ... ] ]
+--where constraint is:
+--[ CONSTRAINT constraint_name ]
+--{ NOT NULL | NULL | CHECK (expression) }
+
+CREATE DOMAIN us_postal_code AS TEXT
+ CHECK(
+ VALUE ~ '^\\d{5}$'
+ OR VALUE ~ '^\\d{5}-\\d{4}$'
+ );
+
+--CREATE FOREIGN DATA WRAPPER name
+-- [ VALIDATOR valfunction | NO VALIDATOR ]
+-- [ OPTIONS ( option 'value' [, ... ] ) ]
+
+CREATE FOREIGN DATA WRAPPER dummy;
+
+CREATE FOREIGN DATA WRAPPER postgresql VALIDATOR postgresql_fdw_validator;
+
+CREATE FOREIGN DATA WRAPPER mywrapper
+ OPTIONS (debug 'true');
+
+--CREATE [ OR REPLACE ] FUNCTION
+-- name ( [ [ argmode ] [ argname ] argtype [ { DEFAULT | = } defexpr ] [, ...] ] )
+-- [ RETURNS rettype
+-- | RETURNS TABLE ( colname coltype [, ...] ) ]
+-- { LANGUAGE langname
+-- | WINDOW
+-- | IMMUTABLE | STABLE | VOLATILE
+-- | CALLED ON NULL INPUT | RETURNS NULL ON NULL INPUT | STRICT
+-- | [ EXTERNAL ] SECURITY INVOKER | [ EXTERNAL ] SECURITY DEFINER
+-- | COST execution_cost
+-- | ROWS result_rows
+-- | SET configuration_parameter { TO value | = value | FROM CURRENT }
+-- | AS 'definition'
+-- | AS 'obj_file', 'link_symbol'
+-- } ...
+-- [ WITH ( attribute [, ...] ) ]
+
+CREATE FUNCTION add(integer, integer) RETURNS integer
+ AS 'select $1 + $2;'
+ LANGUAGE SQL
+ IMMUTABLE
+ RETURNS NULL ON NULL INPUT;
+
+CREATE OR REPLACE FUNCTION increment(i integer) RETURNS integer AS $$
+ BEGIN
+ RETURN i + 1;
+ END;
+$$ LANGUAGE plpgsql;
+
+CREATE FUNCTION dup(in int, out f1 int, out f2 text)
+ AS $$ SELECT $1, CAST($1 AS text) || ' is text' $$
+ LANGUAGE SQL;
+
+CREATE FUNCTION dup(int) RETURNS dup_result
+ AS $$ SELECT $1, CAST($1 AS text) || ' is text' $$
+ LANGUAGE SQL;
+
+CREATE FUNCTION dup(int) RETURNS TABLE(f1 int, f2 text)
+ AS $$ SELECT $1, CAST($1 AS text) || ' is text' $$
+ LANGUAGE SQL;
+-- 50 STATEMENTS *******************************************************
+--CREATE GROUP name [ [ WITH ] option [ ... ] ]
+--where option can be:
+-- SUPERUSER | NOSUPERUSER
+-- | CREATEDB | NOCREATEDB
+-- | CREATEROLE | NOCREATEROLE
+-- | CREATEUSER | NOCREATEUSER
+-- | INHERIT | NOINHERIT
+-- | LOGIN | NOLOGIN
+-- | [ ENCRYPTED | UNENCRYPTED ] PASSWORD 'password '
+-- | VALID UNTIL 'timestamp'
+-- | IN ROLE rolename [, ...]
+-- | IN GROUP rolename [, ...]
+-- | ROLE rolename [, ...]
+-- | ADMIN rolename [, ...]
+-- | USER rolename [, ...]
+-- | SYSID uid
+
+--CREATE [ UNIQUE ] INDEX [ CONCURRENTLY ] name ON table [ USING method ]
+-- ( { column | ( expression ) } [ opclass ] [ ASC | DESC ] [ NULLS { FIRST | LAST } ] [, ...] )
+-- [ WITH ( storage_parameter = value [, ... ] ) ]
+-- [ TABLESPACE tablespace ]
+-- [ WHERE predicate ]
+
+CREATE UNIQUE INDEX title_idx ON films (title);
+
+CREATE INDEX lower_title_idx ON films ((lower(title)));
+
+CREATE INDEX title_idx_nulls_low ON films (title NULLS FIRST);
+
+CREATE UNIQUE INDEX title_idx ON films (title) WITH (fillfactor = 70);
+
+CREATE INDEX gin_idx ON documents_table (locations) WITH (fastupdate = off);
+
+CREATE INDEX code_idx ON films(code) TABLESPACE indexspace;
+
+CREATE INDEX CONCURRENTLY sales_quantity_index ON sales_table (quantity);
+
+--CREATE [ PROCEDURAL ] LANGUAGE name
+--CREATE [ TRUSTED ] [ PROCEDURAL ] LANGUAGE name
+-- HANDLER call_handler [ VALIDATOR valfunction ]
+
+CREATE LANGUAGE plpgsql;
+
+CREATE PROCEDURAL LANGUAGE plpgsql;
+
+CREATE TRUSTED PROCEDURAL LANGUAGE plpgsql;
+-- 60 STATEMENTS *******************************************************
+CREATE LANGUAGE plsample
+ HANDLER plsample_call_handler;
+
+--CREATE OPERATOR name (
+-- PROCEDURE = funcname
+-- [, LEFTARG = lefttype ] [, RIGHTARG = righttype ]
+-- [, COMMUTATOR = com_op ] [, NEGATOR = neg_op ]
+-- [, RESTRICT = res_proc ] [, JOIN = join_proc ]
+-- [, HASHES ] [, MERGES ]
+--)
+
+CREATE OPERATOR === (
+ LEFTARG = box,
+ RIGHTARG = box,
+ PROCEDURE = area_equal_procedure,
+ COMMUTATOR = ===,
+ NEGATOR = !==,
+ RESTRICT = area_restriction_procedure,
+ JOIN = area_join_procedure,
+ HASHES, MERGES
+);
+
+--CREATE OPERATOR CLASS name [ DEFAULT ] FOR TYPE data_type
+-- USING index_method [ FAMILY family_name ] AS
+-- { OPERATOR strategy_number operator_name [ ( op_type, op_type ) ]
+-- | FUNCTION support_number [ ( op_type [ , op_type ] ) ] funcname ( argument_type [, ...] )
+-- | STORAGE storage_type
+-- } [, ... ]
+
+CREATE OPERATOR CLASS gist__int_ops
+ DEFAULT FOR TYPE _int4 USING gist AS
+ OPERATOR 3 &&,
+ OPERATOR 6 = (anyarray, anyarray),
+ OPERATOR 7 @>,
+ OPERATOR 8 <@,
+ OPERATOR 20 @@ (_int4, query_int),
+ FUNCTION 1 g_int_consistent (internal, _int4, int, oid, internal),
+ FUNCTION 2 g_int_union (internal, internal),
+ FUNCTION 3 g_int_compress (internal),
+ FUNCTION 4 g_int_decompress (internal),
+ FUNCTION 5 g_int_penalty (internal, internal, internal),
+ FUNCTION 6 g_int_picksplit (internal, internal),
+ FUNCTION 7 g_int_same (_int4, _int4, internal);
+
+--CREATE OPERATOR FAMILY name USING index_method
+
+CREATE OPERATOR FAMILY name USING index_method;
+
+--CREATE ROLE name [ [ WITH ] option [ ... ] ]
+--where option can be:
+-- SUPERUSER | NOSUPERUSER
+-- | CREATEDB | NOCREATEDB
+-- | CREATEROLE | NOCREATEROLE
+-- | CREATEUSER | NOCREATEUSER
+-- | INHERIT | NOINHERIT
+-- | LOGIN | NOLOGIN
+-- | CONNECTION LIMIT connlimit
+-- | [ ENCRYPTED | UNENCRYPTED ] PASSWORD 'password '
+-- | VALID UNTIL 'timestamp'
+-- | IN ROLE rolename [, ...]
+-- | IN GROUP rolename [, ...]
+-- | ROLE rolename [, ...]
+-- | ADMIN rolename [, ...]
+-- | USER rolename [, ...]
+-- | SYSID uid
+
+CREATE ROLE jonathan LOGIN;
+
+CREATE USER davide WITH PASSWORD 'jw8s0F4';
+
+CREATE ROLE miriam WITH LOGIN PASSWORD 'jw8s0F4' VALID UNTIL '2005-01-01';
+
+CREATE ROLE admin WITH CREATEDB CREATEROLE;
+
+--CREATE [ OR REPLACE ] RULE name AS ON event
+-- TO table [ WHERE condition ]
+-- DO [ ALSO | INSTEAD ] { NOTHING | command | ( command ; command ... ) }
+
+CREATE RULE "_RETURN" AS
+ ON SELECT TO t1
+ DO INSTEAD
+ SELECT * FROM t2;
+
+CREATE RULE "_RETURN" AS
+ ON SELECT TO t2
+ DO INSTEAD
+ SELECT * FROM t1;
+-- 70 STATEMENTS *******************************************************
+CREATE RULE notify_me AS ON UPDATE TO mytable DO ALSO NOTIFY mytable;
+
+--CREATE SCHEMA schemaname [ AUTHORIZATION username ] [ schema_element [ ... ] ]
+--CREATE SCHEMA AUTHORIZATION username [ schema_element [ ... ] ]
+
+CREATE SCHEMA myschema;
+
+CREATE SCHEMA AUTHORIZATION joe;
+
+CREATE SCHEMA hollywood
+ CREATE TABLE films (title text, release date, awards text[])
+ CREATE VIEW winners AS
+ SELECT title, release FROM films WHERE awards IS NOT NULL;
+
+--CREATE [ TEMPORARY | TEMP ] SEQUENCE name [ INCREMENT [ BY ] increment ]
+-- [ MINVALUE minvalue | NO MINVALUE ] [ MAXVALUE maxvalue | NO MAXVALUE ]
+-- [ START [ WITH ] start ] [ CACHE cache ] [ [ NO ] CYCLE ]
+-- [ OWNED BY { table.column | NONE } ]
+
+CREATE SEQUENCE serial START 101;
+
+--CREATE SERVER servername [ TYPE 'servertype' ] [ VERSION 'serverversion' ]
+-- FOREIGN DATA WRAPPER fdwname
+-- [ OPTIONS ( option 'value' [, ... ] ) ]
+
+CREATE SERVER foo FOREIGN DATA WRAPPER "default";
+
+CREATE SERVER myserver FOREIGN DATA WRAPPER pgsql OPTIONS (host 'foo', dbname 'foodb', port '5432');
+
+--CREATE [ [ GLOBAL | LOCAL ] { TEMPORARY | TEMP } ] TABLE table_name
+-- ( [
+-- { column_name data_type [ DEFAULT default_expr ] [ column_constraint [ ... ] ]
+-- | table_constraint
+-- | LIKE parent_table [ { INCLUDING | EXCLUDING } { DEFAULTS | CONSTRAINTS | INDEXES } ] ... }
+-- [, ... ]
+-- ] )
+--[ INHERITS ( parent_table [, ... ] ) ]
+--[ WITH ( storage_parameter [= value] [, ... ] ) | WITH OIDS | WITHOUT OIDS ]
+--[ ON COMMIT { PRESERVE ROWS | DELETE ROWS | DROP } ]
+--[ TABLESPACE tablespace ]
+--where column_constraint is:
+--[ CONSTRAINT constraint_name ]
+--{ NOT NULL |
+-- NULL |
+-- UNIQUE index_parameters |
+-- PRIMARY KEY index_parameters |
+-- CHECK ( expression ) |
+-- REFERENCES reftable [ ( refcolumn ) ] [ MATCH FULL | MATCH PARTIAL | MATCH SIMPLE ]
+-- [ ON DELETE action ] [ ON UPDATE action ] }
+--[ DEFERRABLE | NOT DEFERRABLE ] [ INITIALLY DEFERRED | INITIALLY IMMEDIATE ]
+--and table_constraint is:
+--[ CONSTRAINT constraint_name ]
+--{ UNIQUE ( column_name [, ... ] ) index_parameters |
+-- PRIMARY KEY ( column_name [, ... ] ) index_parameters |
+-- CHECK ( expression ) |
+-- FOREIGN KEY ( column_name [, ... ] ) REFERENCES reftable [ ( refcolumn [, ... ] ) ]
+-- [ MATCH FULL | MATCH PARTIAL | MATCH SIMPLE ] [ ON DELETE action ] [ ON UPDATE action ]
+--[ DEFERRABLE | NOT DEFERRABLE ] [ INITIALLY DEFERRED | INITIALLY IMMEDIATE ]
+--index_parameters in UNIQUE and PRIMARY KEY constraints are:
+--[ WITH ( storage_parameter [= value] [, ... ] ) ]
+--[ USING INDEX TABLESPACE tablespace ]
+
+CREATE TABLE films (
+ code char(5) CONSTRAINT firstkey PRIMARY KEY,
+ title varchar(40) NOT NULL,
+ did integer NOT NULL,
+ date_prod date,
+ kind varchar(10),
+ len interval hour to minute
+);
+
+CREATE TABLE distributors (
+ did integer PRIMARY KEY DEFAULT nextval('serial'),
+ name varchar(40) NOT NULL CHECK (name <> '')
+
+);
+
+CREATE TABLE array_int (
+ vector int[][]
+);
+-- 80 STATEMENTS *******************************************************
+CREATE TABLE films (
+ code char(5),
+ title varchar(40),
+ did integer,
+ date_prod date,
+ kind varchar(10),
+ len interval hour to minute,
+ CONSTRAINT production UNIQUE(date_prod)
+);
+
+CREATE TABLE distributors (
+ did integer CHECK (did > 100),
+ name varchar(40)
+);
+
+CREATE TABLE distributors (
+ did integer,
+ name varchar(40)
+ CONSTRAINT con1 CHECK (did > 100 AND name <> '')
+);
+
+CREATE TABLE films (
+ code char(5),
+ title varchar(40),
+ did integer,
+ date_prod date,
+ kind varchar(10),
+ len interval hour to minute,
+ CONSTRAINT code_title PRIMARY KEY(code,title)
+);
+
+CREATE TABLE films (
+ code char(5),
+ title varchar(40),
+ did integer,
+ date_prod date,
+ kind varchar(10),
+ len interval hour to minute,
+ CONSTRAINT code_title PRIMARY KEY(code,title)
+);
+
+CREATE TABLE distributors (
+ name varchar(40) DEFAULT 'Luso Films',
+ did integer DEFAULT nextval('distributors_serial'),
+ modtime timestamp DEFAULT current_timestamp
+);
+
+CREATE TABLE distributors (
+ did integer CONSTRAINT no_null NOT NULL,
+ name varchar(40) NOT NULL
+);
+
+CREATE TABLE distributors (
+ did integer,
+ name varchar(40) UNIQUE
+);
+
+CREATE TABLE distributors (
+ did integer,
+ name varchar(40),
+ UNIQUE(name)
+);
+
+CREATE TABLE distributors (
+ did integer,
+ name varchar(40),
+ UNIQUE(name) WITH (fillfactor=70)
+)
+WITH (fillfactor=70);
+-- 90 STATEMENTS *******************************************************
+CREATE TABLE cinemas (
+ id serial,
+ name text,
+ location text
+) TABLESPACE diskvol1;
+
+--CREATE [ [ GLOBAL | LOCAL ] { TEMPORARY | TEMP } ] TABLE table_name
+-- [ (column_name [, ...] ) ]
+-- [ WITH ( storage_parameter [= value] [, ... ] ) | WITH OIDS | WITHOUT OIDS ]
+-- [ ON COMMIT { PRESERVE ROWS | DELETE ROWS | DROP } ]
+-- [ TABLESPACE tablespace ]
+-- AS query
+-- [ WITH [ NO ] DATA ]
+
+CREATE TABLE films_recent AS
+ SELECT * FROM films WHERE date_prod >= '2002-01-01';
+
+CREATE TABLE films2 AS
+ TABLE films;
+
+CREATE TEMP TABLE films_recent WITH (OIDS) ON COMMIT DROP AS
+ EXECUTE recentfilms('2002-01-01');
+
+--CREATE TABLESPACE tablespacename [ OWNER username ] LOCATION 'directory '
+
+CREATE TABLESPACE dbspace LOCATION '/data/dbs';
+
+CREATE TABLESPACE indexspace OWNER genevieve LOCATION '/data/indexes';
+
+--CREATE TEXT SEARCH CONFIGURATION name (
+-- PARSER = parser_name |
+-- COPY = source_config
+--)
+
+CREATE TEXT SEARCH CONFIGURATION my_search_config (
+ PARSER = my_parser
+);
+
+--CREATE TEXT SEARCH DICTIONARY name (
+-- TEMPLATE = template
+-- [, option = value [, ... ]]
+--)
+
+CREATE TEXT SEARCH DICTIONARY my_russian (
+ template = snowball,
+ language = russian,
+ stopwords = myrussian
+);
+
+--CREATE TEXT SEARCH PARSER name (
+-- START = start_function ,
+-- GETTOKEN = gettoken_function ,
+-- END = end_function ,
+-- LEXTYPES = lextypes_function
+-- [, HEADLINE = headline_function ]
+--)
+
+CREATE TEXT SEARCH PARSER my_search_parser (
+ START = startNow(),
+ GETTOKEN = getToken(),
+ END = end(),
+ LEXTYPES = getLexTypes()
+);
+
+--CREATE TEXT SEARCH TEMPLATE name (
+-- [ INIT = init_function , ]
+-- LEXIZE = lexize_function
+--)
+
+CREATE TEXT SEARCH TEMPLATE my_search_template (
+ LEXIZE = lexizeNow()
+);
+
+--CREATE TRIGGER name { BEFORE | AFTER } { event [ OR ... ] }
+-- ON table [ FOR [ EACH ] { ROW | STATEMENT } ]
+-- EXECUTE PROCEDURE funcname ( arguments )
+-- 100 STATEMENTS *******************************************************
+CREATE TRIGGER trigger_name BEFORE dawn
+ ON table
+ EXECUTE PROCEDURE funcname ( 'arg1', 'arg2' );
+
+ALTER TABLE foreign_companies RENAME COLUMN address TO city;
+
+ALTER TABLE us_companies RENAME TO suppliers;
+
+ALTER TABLE old_addresses ALTER COLUMN street SET NOT NULL;
+
+ALTER TABLE new_addresses ALTER COLUMN street DROP NOT NULL;
+
+GRANT EXECUTE ON FUNCTION divideByTwo(numerator int, IN demoninator int) TO george;
+
+-- 106 STATEMENTS *******************************************************
Property changes on: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/dialect/postgres/postgres_test_statements.ddl
___________________________________________________________________
Name: svn:executable
+ *
Added: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/grant_test_statements.ddl
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/grant_test_statements.ddl (rev 0)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/grant_test_statements.ddl 2010-01-05 20:33:13 UTC (rev 1530)
@@ -0,0 +1,7 @@
+GRANT SELECT ON TABLE purchaseOrders TO maria,harry;
+
+GRANT UPDATE, USAGE ON TABLE billedOrders TO anita,zhi;
+
+GRANT SELECT ON TABLE orders.bills to PUBLIC;
+
+GRANT INSERT(a, b, c) ON TABLE purchaseOrders TO purchases_reader_role;
\ No newline at end of file
Deleted: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/oracle_test_statements.ddl
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/oracle_test_statements.ddl 2010-01-05 19:16:20 UTC (rev 1529)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/oracle_test_statements.ddl 2010-01-05 20:33:13 UTC (rev 1530)
@@ -1,234 +0,0 @@
---
--- SAMPLE ORACLE STATEMENTS
---
-
-ALTER TABLE employees
- PCTFREE 30
- PCTUSED 60;
-
-ALTER TABLE countries
- ADD (duty_pct NUMBER(2,2) CHECK (duty_pct < 10.5),
- visa_needed VARCHAR2(3));
-
-ALTER TABLESPACE tbs_01
- BEGIN BACKUP;
-
-ALTER TABLESPACE omf_ts1 ADD DATAFILE;
-
-ALTER TABLESPACE undots1
- RETENTION NOGUARANTEE;
-
-ALTER TRIGGER update_job_history DISABLE;
-
-ALTER TYPE data_typ
- ADD MEMBER FUNCTION qtr(der_qtr DATE)
- RETURN CHAR CASCADE;
-
-ALTER TYPE cust_address_typ
- ADD ATTRIBUTE (phone phone_list_typ) CASCADE;
-
-ALTER TYPE phone_list_typ
- MODIFY ELEMENT TYPE VARCHAR(64) CASCADE;
-
-ALTER USER app_user1
- GRANT CONNECT THROUGH sh
- WITH ROLE warehouse_user;
-
--- 10 Statements
-
-ALTER USER app_user1 IDENTIFIED GLOBALLY AS 'CN=tom,O=oracle,C=US';
-
-ALTER USER sidney
- IDENTIFIED BY second_2nd_pwd
- DEFAULT TABLESPACE example;
-
-ALTER VIEW customer_ro
- COMPILE;
-
-ANALYZE TABLE customers VALIDATE STRUCTURE ONLINE;
-
-ANALYZE TABLE employees VALIDATE STRUCTURE CASCADE;
-
-ANALYZE TABLE orders DELETE STATISTICS;
-
-ASSOCIATE STATISTICS WITH PACKAGES emp_mgmt DEFAULT SELECTIVITY 10;
-
-AUDIT SELECT
- ON hr.employees
- WHENEVER SUCCESSFUL;
-
-AUDIT INSERT, UPDATE
- ON oe.customers;
-
-AUDIT DELETE ANY TABLE;
-
--- 20 Statements
-
-AUDIT ROLE
- WHENEVER SUCCESSFUL;
-
-COMMENT ON COLUMN employees.job_id
- IS 'abbreviated job title';
-
-COMMIT WORK;
-
-COMMIT COMMENT 'In-doubt transaction Code 36, Call (415) 555-2637';
-
-CREATE CLUSTER personnel
- (department NUMBER(4))
-SIZE 512
-STORAGE (initial 100K next 50K);
-
-CREATE CLUSTER address
- (postal_code NUMBER, country_id CHAR(2))
- HASHKEYS 20
- HASH IS MOD(postal_code + country_id, 101);
-
-CREATE CLUSTER cust_orders (customer_id NUMBER(6))
- SIZE 512 SINGLE TABLE HASHKEYS 100;
-
-CREATE CONTEXT hr_context USING emp_mgmt;
-
-CREATE CONTROLFILE REUSE DATABASE "demo" NORESETLOGS NOARCHIVELOG
- MAXLOGFILES 32
- MAXLOGMEMBERS 2
- MAXDATAFILES 32
- MAXINSTANCES 1
- MAXLOGHISTORY 449
- LOGFILE
- GROUP 1 '/path/oracle/dbs/t_log1.f' SIZE 500K,
- GROUP 2 '/path/oracle/dbs/t_log2.f' SIZE 500K
- # STANDBY LOGFILE
- DATAFILE
- '/path/oracle/dbs/t_db1.f',
- '/path/oracle/dbs/dbu19i.dbf',
- '/path/oracle/dbs/tbs_11.f',
- '/path/oracle/dbs/smundo.dbf',
- '/path/oracle/dbs/demo.dbf'
- CHARACTER SET WE8DEC
- ;
-
-CREATE DATABASE sample
- CONTROLFILE REUSE
- LOGFILE
- GROUP 1 ('diskx:log1.log', 'disky:log1.log') SIZE 50K,
- GROUP 2 ('diskx:log2.log', 'disky:log2.log') SIZE 50K
- MAXLOGFILES 5
- MAXLOGHISTORY 100
- MAXDATAFILES 10
- MAXINSTANCES 2
- ARCHIVELOG
- CHARACTER SET AL32UTF8
- NATIONAL CHARACTER SET AL16UTF16
- DATAFILE
- 'disk1:df1.dbf' AUTOEXTEND ON,
- 'disk2:df2.dbf' AUTOEXTEND ON NEXT 10M MAXSIZE UNLIMITED
- DEFAULT TEMPORARY TABLESPACE temp_ts
- UNDO TABLESPACE undo_ts
- SET TIME_ZONE = '+02:00';
-
--- 30 Statements
-
-CREATE PUBLIC DATABASE LINK remote
- USING 'remote';
-
-CREATE DATABASE LINK local
- CONNECT TO hr IDENTIFIED BY hr
- USING 'local';
-
-CREATE DIMENSION customers_dim
- LEVEL customer IS (customers.cust_id)
- LEVEL city IS (customers.cust_city)
- LEVEL state IS (customers.cust_state_province)
- LEVEL country IS (countries.country_id)
- LEVEL subregion IS (countries.country_subregion)
- LEVEL region IS (countries.country_region)
- HIERARCHY geog_rollup (
- customer CHILD OF
- city CHILD OF
- state CHILD OF
- country CHILD OF
- subregion CHILD OF
- region
- JOIN KEY (customers.country_id) REFERENCES country
- )
- ATTRIBUTE customer DETERMINES
- (cust_first_name, cust_last_name, cust_gender,
- cust_marital_status, cust_year_of_birth,
- cust_income_level, cust_credit_limit)
- ATTRIBUTE country DETERMINES (countries.country_name)
-;
-
-CREATE DIRECTORY admin AS 'oracle/admin';
-
-CREATE OR REPLACE DIRECTORY bfile_dir AS '/private1/LOB/files';
-
-CREATE DISKGROUP dgroup_01
- EXTERNAL REDUNDANCY
- DISK '$ORACLE_HOME/disks/c*';
-
-CREATE FUNCTION SecondMax (input NUMBER) RETURN NUMBER
- PARALLEL_ENABLE AGGREGATE USING SecondMaxImpl;
-
-CREATE OR REPLACE FUNCTION text_length(a CLOB)
- RETURN NUMBER DETERMINISTIC IS
- BEGIN
- RETURN DBMS_LOB.GETLENGTH(a);
- END;
-/
-
-CREATE INDEXTYPE position_indextype
- FOR position_between(NUMBER, NUMBER, NUMBER)
- USING position_im;
-
-CREATE JAVA SOURCE NAMED "Hello" AS
- public class Hello {
- public static String hello() {
- return \"Hello World\"; } };
-
--- 40 Statements
-
-CREATE JAVA RESOURCE NAMED "appText"
- USING BFILE (bfile_dir, 'textBundle.dat');
-
-CREATE LIBRARY ext_lib AS '/OR/lib/ext_lib.so';
-/
-
-CREATE OR REPLACE LIBRARY ext_lib IS '/OR/newlib/ext_lib.so';
-/
-
-CREATE LIBRARY app_lib as '${ORACLE_HOME}/lib/app_lib.so'
- AGENT 'sales.hq.acme.com';
-/
-
-CREATE MATERIALIZED VIEW LOG ON employees
- WITH PRIMARY KEY
- INCLUDING NEW VALUES;
-
-CREATE MATERIALIZED VIEW all_customers
- PCTFREE 5 PCTUSED 60
- TABLESPACE example
- STORAGE (INITIAL 50K NEXT 50K)
- USING INDEX STORAGE (INITIAL 25K NEXT 25K)
- REFRESH START WITH ROUND(SYSDATE + 1) + 11/24
- NEXT NEXT_DAY(TRUNC(SYSDATE), 'MONDAY') + 15/24
- AS SELECT * FROM sh.customers@remote
- UNION
- SELECT * FROM sh.customers@local;
-
-CREATE MATERIALIZED VIEW LOG ON product_information
- WITH ROWID, SEQUENCE (list_price, min_price, category_id)
- INCLUDING NEW VALUES;
-
-CREATE OPERATOR eq_op
- BINDING (VARCHAR2, VARCHAR2)
- RETURN NUMBER
- USING eq_f;
-
-CREATE OUTLINE salaries FOR CATEGORY special
- ON SELECT last_name, salary FROM employees;
-
-CREATE OR REPLACE OUTLINE public_salaries
- FROM PRIVATE my_salaries;
-
--- 50 Statements so far
Deleted: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/postgres_test_statements.ddl
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/postgres_test_statements.ddl 2010-01-05 19:16:20 UTC (rev 1529)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/postgres_test_statements.ddl 2010-01-05 20:33:13 UTC (rev 1530)
@@ -1,609 +0,0 @@
--- Postgres SQL Statements from postgressql-8.4.1-US.pdf
---
--- Extracted 10/5/2009
-
---COMMENT ON
---{
--- TABLE object_name |
--- COLUMN table_name.column_name |
--- AGGREGATE agg_name (agg_type [, ...] ) |
--- CAST (sourcetype AS targettype) |
--- CONSTRAINT constraint_name ON table_name |
--- CONVERSION object_name |
--- DATABASE object_name |
--- DOMAIN object_name |
--- FUNCTION func_name ( [ [ argmode ] [ argname ] argtype [, ...] ] ) |
--- INDEX object_name |
--- LARGE OBJECT large_object_oid |
--- OPERATOR op (leftoperand_type, rightoperand_type) |
--- OPERATOR CLASS object_name USING index_method |
--- OPERATOR FAMILY object_name USING index_method |
--- [ PROCEDURAL ] LANGUAGE object_name |
--- ROLE object_name |
--- RULE rule_name ON table_name |
--- SCHEMA object_name |
--- SEQUENCE object_name |
--- TABLESPACE object_name |
--- TEXT SEARCH CONFIGURATION object_name |
--- TEXT SEARCH DICTIONARY object_name |
--- TEXT SEARCH PARSER object_name |
--- TEXT SEARCH TEMPLATE object_name |
--- TRIGGER trigger_name ON table_name |
--- TYPE object_name |
--- VIEW object_name
---} IS 'text'
-
-COMMENT ON TABLE mytable IS 'This is my table.';
-COMMENT ON TABLE mytable IS NULL;
-COMMENT ON AGGREGATE my_aggregate (double precision) IS 'Computes sample variance';
-COMMENT ON CAST (text AS int4) IS 'Allow casts from text to int4';
-COMMENT ON COLUMN my_table.my_column IS 'Employee ID number';
-COMMENT ON CONVERSION my_conv IS 'Conversion to UTF8';
-COMMENT ON DATABASE my_database IS 'Development Database';
-COMMENT ON DOMAIN my_domain IS 'Email Address Domain';
-COMMENT ON FUNCTION my_function (timestamp) IS 'Returns Roman Numeral';
-COMMENT ON INDEX my_index IS 'Enforces uniqueness on employee ID';
--- 10 STATEMENTS *******************************************************
-COMMENT ON LANGUAGE plpython IS 'Python support for stored procedures';
-COMMENT ON LARGE OBJECT 346344 IS 'Planning document';
-COMMENT ON OPERATOR ^ (text, text) IS 'Performs intersection of two texts';
-COMMENT ON OPERATOR - (NONE, text) IS 'This is a prefix operator on text';
-COMMENT ON OPERATOR CLASS int4ops USING btree IS '4 byte integer operators for btrees';
-COMMENT ON OPERATOR FAMILY integer_ops USING btree IS 'all integer operators for btrees';
-COMMENT ON ROLE my_role IS 'Administration group for finance tables';
-COMMENT ON RULE my_rule ON my_table IS 'Logs updates of employee records';
-COMMENT ON SCHEMA my_schema IS 'Departmental data';
-COMMENT ON SEQUENCE my_sequence IS 'Used to generate primary keys';
--- 20 STATEMENTS *******************************************************
-COMMENT ON TABLE my_schema.my_table IS 'Employee Information';
-COMMENT ON TABLESPACE my_tablespace IS 'Tablespace for indexes';
-COMMENT ON TEXT SEARCH CONFIGURATION my_config IS 'Special word filtering';
-COMMENT ON TEXT SEARCH DICTIONARY swedish IS 'Snowball stemmer for swedish language';
-COMMENT ON TEXT SEARCH PARSER my_parser IS 'Splits text into words';
-COMMENT ON TEXT SEARCH TEMPLATE snowball IS 'Snowball stemmer';
-COMMENT ON TRIGGER my_trigger ON my_table IS 'Used for RI';
-COMMENT ON TYPE complex IS 'Complex number data type';
-COMMENT ON VIEW my_view IS 'View of departmental costs';
---COMMIT [ WORK | TRANSACTION ]
-
-COMMIT WORK;
--- 30 STATEMENTS *******************************************************
-COMMIT TRANSACTION;
-
-COMMIT;
-
---COMMIT PREPARED transaction_id;
-
-COMMIT PREPARED 'foobar';
-
---COPY tablename [ ( column [, ...] ) ]
--- FROM { 'filename' | STDIN }
--- [ [ WITH ]
--- [ BINARY ]
--- [ OIDS ]
--- [ DELIMITER [ AS ] 'delimiter ' ]
--- [ NULL [ AS ] 'null string ' ]
--- [ CSV [ HEADER ]
--- [ QUOTE [ AS ] 'quote' ]
--- [ ESCAPE [ AS ] 'escape' ]
--- [ FORCE NOT NULL column [, ...] ]
---COPY { tablename [ ( column [, ...] ) ] | ( query ) }
--- TO { 'filename' | STDOUT }
--- [ [ WITH ]
--- [ BINARY ]
--- [ OIDS ]
--- [ DELIMITER [ AS ] 'delimiter ' ]
--- [ NULL [ AS ] 'null string ' ]
--- [ CSV [ HEADER ]
--- [ QUOTE [ AS ] 'quote' ]
--- [ ESCAPE [ AS ] 'escape' ]
--- [ FORCE QUOTE column [, ...] ]
-
-COPY country TO STDOUT WITH DELIMITER '|';
-
-COPY country FROM '/usr1/proj/bray/sql/country_data';
-
-COPY (SELECT * FROM country WHERE country_name LIKE 'A%') TO '/usr1/proj/bray/sql/a_list_co';
-
---CREATE AGGREGATE name ( input_data_type [ , ... ] ) (
--- SFUNC = sfunc,
--- STYPE = state_data_type
--- [ , FINALFUNC = ffunc ]
--- [ , INITCOND = initial_condition ]
--- [ , SORTOP = sort_operator ]
---)
---or the old syntax
---CREATE AGGREGATE name (
--- BASETYPE = base_type,
--- SFUNC = sfunc,
--- STYPE = state_data_type
--- [ , FINALFUNC = ffunc ]
--- [ , INITCOND = initial_condition ]
--- [ , SORTOP = sort_operator ]
---)
-
-
---CREATE CAST (sourcetype AS targettype)
--- WITH FUNCTION funcname (argtypes)
--- [ AS ASSIGNMENT | AS IMPLICIT ]
---CREATE CAST (sourcetype AS targettype)
--- WITHOUT FUNCTION
--- [ AS ASSIGNMENT | AS IMPLICIT ]
---CREATE CAST (sourcetype AS targettype)
--- WITH INOUT
--- [ AS ASSIGNMENT | AS IMPLICIT ]
-
-CREATE CAST (bigint AS int4) WITH FUNCTION int4(bigint) AS ASSIGNMENT;
-
---CREATE CONSTRAINT TRIGGER name
--- AFTER event [ OR ... ]
--- ON table_name
--- [ FROM referenced_table_name ]
--- { NOT DEFERRABLE | [ DEFERRABLE ] { INITIALLY IMMEDIATE | INITIALLY DEFERRED } }
--- FOR EACH ROW
--- EXECUTE PROCEDURE funcname ( arguments )
-
---CREATE [ DEFAULT ] CONVERSION name
--- FOR source_encoding TO dest_encoding FROM funcname
-
-CREATE CONVERSION myconv FOR 'UTF8' TO 'LATIN1' FROM myfunc;
-
---CREATE DATABASE name
--- [ [ WITH ] [ OWNER [=] dbowner ]
--- [ TEMPLATE [=] template ]
--- [ ENCODING [=] encoding ]
--- [ LC_COLLATE [=] lc_collate ]
--- [ LC_CTYPE [=] lc_ctype ]
--- [ TABLESPACE [=] tablespace ]
--- [ CONNECTION LIMIT [=] connlimit ] ]
-
-CREATE DATABASE lusiadas;
-
-CREATE DATABASE sales OWNER salesapp TABLESPACE salesspace;
--- 40 STATEMENTS *******************************************************
-CREATE DATABASE music ENCODING 'LATIN1' TEMPLATE template0;
-
-
---CREATE DOMAIN name [ AS ] data_type
--- [ DEFAULT expression ]
--- [ constraint [ ... ] ]
---where constraint is:
---[ CONSTRAINT constraint_name ]
---{ NOT NULL | NULL | CHECK (expression) }
-
-CREATE DOMAIN us_postal_code AS TEXT
- CHECK(
- VALUE ~ '^\\d{5}$'
- OR VALUE ~ '^\\d{5}-\\d{4}$'
- );
-
---CREATE FOREIGN DATA WRAPPER name
--- [ VALIDATOR valfunction | NO VALIDATOR ]
--- [ OPTIONS ( option 'value' [, ... ] ) ]
-
-CREATE FOREIGN DATA WRAPPER dummy;
-
-CREATE FOREIGN DATA WRAPPER postgresql VALIDATOR postgresql_fdw_validator;
-
-CREATE FOREIGN DATA WRAPPER mywrapper
- OPTIONS (debug 'true');
-
---CREATE [ OR REPLACE ] FUNCTION
--- name ( [ [ argmode ] [ argname ] argtype [ { DEFAULT | = } defexpr ] [, ...] ] )
--- [ RETURNS rettype
--- | RETURNS TABLE ( colname coltype [, ...] ) ]
--- { LANGUAGE langname
--- | WINDOW
--- | IMMUTABLE | STABLE | VOLATILE
--- | CALLED ON NULL INPUT | RETURNS NULL ON NULL INPUT | STRICT
--- | [ EXTERNAL ] SECURITY INVOKER | [ EXTERNAL ] SECURITY DEFINER
--- | COST execution_cost
--- | ROWS result_rows
--- | SET configuration_parameter { TO value | = value | FROM CURRENT }
--- | AS 'definition'
--- | AS 'obj_file', 'link_symbol'
--- } ...
--- [ WITH ( attribute [, ...] ) ]
-
-CREATE FUNCTION add(integer, integer) RETURNS integer
- AS 'select $1 + $2;'
- LANGUAGE SQL
- IMMUTABLE
- RETURNS NULL ON NULL INPUT;
-
-CREATE OR REPLACE FUNCTION increment(i integer) RETURNS integer AS $$
- BEGIN
- RETURN i + 1;
- END;
-
-CREATE FUNCTION dup(in int, out f1 int, out f2 text)
- AS $$ SELECT $1, CAST($1 AS text) || ' is text' $$
- LANGUAGE SQL;
-
-CREATE FUNCTION dup(int) RETURNS dup_result
- AS $$ SELECT $1, CAST($1 AS text) || ' is text' $$
- LANGUAGE SQL;
-
-CREATE FUNCTION dup(int) RETURNS TABLE(f1 int, f2 text)
- AS $$ SELECT $1, CAST($1 AS text) || ' is text' $$
- LANGUAGE SQL;
--- 50 STATEMENTS *******************************************************
---CREATE GROUP name [ [ WITH ] option [ ... ] ]
---where option can be:
--- SUPERUSER | NOSUPERUSER
--- | CREATEDB | NOCREATEDB
--- | CREATEROLE | NOCREATEROLE
--- | CREATEUSER | NOCREATEUSER
--- | INHERIT | NOINHERIT
--- | LOGIN | NOLOGIN
--- | [ ENCRYPTED | UNENCRYPTED ] PASSWORD 'password '
--- | VALID UNTIL 'timestamp'
--- | IN ROLE rolename [, ...]
--- | IN GROUP rolename [, ...]
--- | ROLE rolename [, ...]
--- | ADMIN rolename [, ...]
--- | USER rolename [, ...]
--- | SYSID uid
-
---CREATE [ UNIQUE ] INDEX [ CONCURRENTLY ] name ON table [ USING method ]
--- ( { column | ( expression ) } [ opclass ] [ ASC | DESC ] [ NULLS { FIRST | LAST } ] [, ..
--- [ WITH ( storage_parameter = value [, ... ] ) ]
--- [ TABLESPACE tablespace ]
--- [ WHERE predicate ]
-
-CREATE UNIQUE INDEX title_idx ON films (title);
-
-CREATE INDEX lower_title_idx ON films ((lower(title)));
-
-CREATE INDEX title_idx_nulls_low ON films (title NULLS FIRST);
-
-CREATE UNIQUE INDEX title_idx ON films (title) WITH (fillfactor = 70);
-
-CREATE INDEX gin_idx ON documents_table (locations) WITH (fastupdate = off);
-
-CREATE INDEX code_idx ON films(code) TABLESPACE indexspace;
-
-CREATE INDEX CONCURRENTLY sales_quantity_index ON sales_table (quantity);
-
---CREATE [ PROCEDURAL ] LANGUAGE name
---CREATE [ TRUSTED ] [ PROCEDURAL ] LANGUAGE name
--- HANDLER call_handler [ VALIDATOR valfunction ]
-
-CREATE LANGUAGE plpgsql;
-
-CREATE PROCEDURAL LANGUAGE plpgsql;
-
-CREATE TRUSTED PROCEDURAL LANGUAGE plpgsql;
--- 60 STATEMENTS *******************************************************
-CREATE LANGUAGE plsample
- HANDLER plsample_call_handler;
-
---CREATE OPERATOR name (
--- PROCEDURE = funcname
--- [, LEFTARG = lefttype ] [, RIGHTARG = righttype ]
--- [, COMMUTATOR = com_op ] [, NEGATOR = neg_op ]
--- [, RESTRICT = res_proc ] [, JOIN = join_proc ]
--- [, HASHES ] [, MERGES ]
---)
-
-CREATE OPERATOR === (
- LEFTARG = box,
- RIGHTARG = box,
- PROCEDURE = area_equal_procedure,
- COMMUTATOR = ===,
- NEGATOR = !==,
- RESTRICT = area_restriction_procedure,
- JOIN = area_join_procedure,
- HASHES, MERGES
-);
-
---CREATE OPERATOR CLASS name [ DEFAULT ] FOR TYPE data_type
--- USING index_method [ FAMILY family_name ] AS
--- { OPERATOR strategy_number operator_name [ ( op_type, op_type ) ]
--- | FUNCTION support_number [ ( op_type [ , op_type ] ) ] funcname ( argument_type [, ...] )
--- | STORAGE storage_type
--- } [, ... ]
-
-CREATE OPERATOR CLASS gist__int_ops
- DEFAULT FOR TYPE _int4 USING gist AS
- OPERATOR 3 &&,
- OPERATOR 6 = (anyarray, anyarray),
- OPERATOR 7 @>,
- OPERATOR 8 <@,
- OPERATOR 20 @@ (_int4, query_int),
- FUNCTION 1 g_int_consistent (internal, _int4, int, oid, internal),
- FUNCTION 2 g_int_union (internal, internal),
- FUNCTION 3 g_int_compress (internal),
- FUNCTION 4 g_int_decompress (internal),
- FUNCTION 5 g_int_penalty (internal, internal, internal),
- FUNCTION 6 g_int_picksplit (internal, internal),
- FUNCTION 7 g_int_same (_int4, _int4, internal);
-
---CREATE OPERATOR FAMILY name USING index_method
-
-CREATE OPERATOR FAMILY name USING index_method;
-
---CREATE ROLE name [ [ WITH ] option [ ... ] ]
---where option can be:
--- SUPERUSER | NOSUPERUSER
--- | CREATEDB | NOCREATEDB
--- | CREATEROLE | NOCREATEROLE
--- | CREATEUSER | NOCREATEUSER
--- | INHERIT | NOINHERIT
--- | LOGIN | NOLOGIN
--- | CONNECTION LIMIT connlimit
--- | [ ENCRYPTED | UNENCRYPTED ] PASSWORD 'password '
--- | VALID UNTIL 'timestamp'
--- | IN ROLE rolename [, ...]
--- | IN GROUP rolename [, ...]
--- | ROLE rolename [, ...]
--- | ADMIN rolename [, ...]
--- | USER rolename [, ...]
--- | SYSID uid
-
-CREATE ROLE jonathan LOGIN;
-
-CREATE USER davide WITH PASSWORD 'jw8s0F4';
-
-CREATE ROLE miriam WITH LOGIN PASSWORD 'jw8s0F4' VALID UNTIL '2005-01-01';
-
-CREATE ROLE admin WITH CREATEDB CREATEROLE;
-
---CREATE [ OR REPLACE ] RULE name AS ON event
--- TO table [ WHERE condition ]
--- DO [ ALSO | INSTEAD ] { NOTHING | command | ( command ; command ... ) }
-
-CREATE RULE "_RETURN" AS
- ON SELECT TO t1
- DO INSTEAD
- SELECT * FROM t2;
-
-CREATE RULE "_RETURN" AS
- ON SELECT TO t2
- DO INSTEAD
- SELECT * FROM t1;
--- 70 STATEMENTS *******************************************************
-CREATE RULE notify_me AS ON UPDATE TO mytable DO ALSO NOTIFY mytable;
-
---CREATE SCHEMA schemaname [ AUTHORIZATION username ] [ schema_element [ ... ] ]
---CREATE SCHEMA AUTHORIZATION username [ schema_element [ ... ] ]
-
-CREATE SCHEMA myschema;
-
-CREATE SCHEMA AUTHORIZATION joe;
-
-CREATE SCHEMA hollywood
- CREATE TABLE films (title text, release date, awards text[])
- CREATE VIEW winners AS
- SELECT title, release FROM films WHERE awards IS NOT NULL;
-
---CREATE [ TEMPORARY | TEMP ] SEQUENCE name [ INCREMENT [ BY ] increment ]
--- [ MINVALUE minvalue | NO MINVALUE ] [ MAXVALUE maxvalue | NO MAXVALUE ]
--- [ START [ WITH ] start ] [ CACHE cache ] [ [ NO ] CYCLE ]
--- [ OWNED BY { table.column | NONE } ]
-
-CREATE SEQUENCE serial START 101;
-
---CREATE SERVER servername [ TYPE 'servertype' ] [ VERSION 'serverversion' ]
--- FOREIGN DATA WRAPPER fdwname
--- [ OPTIONS ( option 'value' [, ... ] ) ]
-
-CREATE SERVER foo FOREIGN DATA WRAPPER "default";
-
-CREATE SERVER myserver FOREIGN DATA WRAPPER pgsql OPTIONS (host 'foo', dbname 'foodb', port);
-
---CREATE [ [ GLOBAL | LOCAL ] { TEMPORARY | TEMP } ] TABLE table_name
--- ( [
--- { column_name data_type [ DEFAULT default_expr ] [ column_constraint [ ... ] ]
--- | table_constraint
--- | LIKE parent_table [ { INCLUDING | EXCLUDING } { DEFAULTS | CONSTRAINTS | INDEXES } ] .
--- [, ... ]
--- ] )
---[ INHERITS ( parent_table [, ... ] ) ]
---[ WITH ( storage_parameter [= value] [, ... ] ) | WITH OIDS | WITHOUT OIDS ]
---[ ON COMMIT { PRESERVE ROWS | DELETE ROWS | DROP } ]
---[ TABLESPACE tablespace ]
---where column_constraint is:
---[ CONSTRAINT constraint_name ]
---{ NOT NULL |
--- NULL |
--- UNIQUE index_parameters |
--- PRIMARY KEY index_parameters |
--- CHECK ( expression ) |
--- REFERENCES reftable [ ( refcolumn ) ] [ MATCH FULL | MATCH PARTIAL | MATCH SIMPLE ]
--- [ ON DELETE action ] [ ON UPDATE action ] }
---[ DEFERRABLE | NOT DEFERRABLE ] [ INITIALLY DEFERRED | INITIALLY IMMEDIATE ]
---and table_constraint is:
---[ CONSTRAINT constraint_name ]
---{ UNIQUE ( column_name [, ... ] ) index_parameters |
--- PRIMARY KEY ( column_name [, ... ] ) index_parameters |
--- CHECK ( expression ) |
--- FOREIGN KEY ( column_name [, ... ] ) REFERENCES reftable [ ( refcolumn [, ... ] ) ]
--- [ MATCH FULL | MATCH PARTIAL | MATCH SIMPLE ] [ ON DELETE action ] [ ON UPDATE action ]
---[ DEFERRABLE | NOT DEFERRABLE ] [ INITIALLY DEFERRED | INITIALLY IMMEDIATE ]
---index_parameters in UNIQUE and PRIMARY KEY constraints are:
---[ WITH ( storage_parameter [= value] [, ... ] ) ]
---[ USING INDEX TABLESPACE tablespace ]
-
-CREATE TABLE films (
- code char(5) CONSTRAINT firstkey PRIMARY KEY,
- title varchar(40) NOT NULL,
- did integer NOT NULL,
- date_prod date,
- kind varchar(10),
- len interval hour to minute
-);
-
-CREATE TABLE distributors (
- did integer PRIMARY KEY DEFAULT nextval('serial'),
- name varchar(40) NOT NULL CHECK (name <> '')
-
-);
-
-CREATE TABLE array_int (
- vector int[][]
-);
--- 80 STATEMENTS *******************************************************
-CREATE TABLE films (
- code char(5),
- title varchar(40),
- did integer,
- date_prod date,
- kind varchar(10),
- len interval hour to minute,
- CONSTRAINT production UNIQUE(date_prod)
-);
-
-CREATE TABLE distributors (
- did integer CHECK (did > 100),
- name varchar(40)
-);
-
-CREATE TABLE distributors (
- did integer,
- name varchar(40)
- CONSTRAINT con1 CHECK (did > 100 AND name <> '')
-);
-
-CREATE TABLE films (
- code char(5),
- title varchar(40),
- did integer,
- date_prod date,
- kind varchar(10),
- len interval hour to minute,
- CONSTRAINT code_title PRIMARY KEY(code,title)
-);
-
-CREATE TABLE films (
- code char(5),
- title varchar(40),
- did integer,
- date_prod date,
- kind varchar(10),
- len interval hour to minute,
- CONSTRAINT code_title PRIMARY KEY(code,title)
-);
-
-CREATE TABLE distributors (
- name varchar(40) DEFAULT 'Luso Films',
- did integer DEFAULT nextval('distributors_serial'),
- modtime timestamp DEFAULT current_timestamp
-);
-
-CREATE TABLE distributors (
- did integer CONSTRAINT no_null NOT NULL,
- name varchar(40) NOT NULL
-);
-
-CREATE TABLE distributors (
- did integer,
- name varchar(40) UNIQUE
-);
-
-CREATE TABLE distributors (
- did integer,
- name varchar(40),
- UNIQUE(name)
-);
-
-CREATE TABLE distributors (
- did integer,
- name varchar(40),
- UNIQUE(name) WITH (fillfactor=70)
-)
-WITH (fillfactor=70);
--- 90 STATEMENTS *******************************************************
-CREATE TABLE cinemas (
- id serial,
- name text,
- location text
-) TABLESPACE diskvol1;
-
---CREATE [ [ GLOBAL | LOCAL ] { TEMPORARY | TEMP } ] TABLE table_name
--- [ (column_name [, ...] ) ]
--- [ WITH ( storage_parameter [= value] [, ... ] ) | WITH OIDS | WITHOUT OIDS ]
--- [ ON COMMIT { PRESERVE ROWS | DELETE ROWS | DROP } ]
--- [ TABLESPACE tablespace ]
--- AS query
--- [ WITH [ NO ] DATA ]
-
-CREATE TABLE films_recent AS
- SELECT * FROM films WHERE date_prod >= '2002-01-01';
-
-CREATE TABLE films2 AS
- TABLE films;
-
-CREATE TEMP TABLE films_recent WITH (OIDS) ON COMMIT DROP AS
- EXECUTE recentfilms('2002-01-01');
-
---CREATE TABLESPACE tablespacename [ OWNER username ] LOCATION 'directory '
-
-CREATE TABLESPACE dbspace LOCATION '/data/dbs';
-
-CREATE TABLESPACE indexspace OWNER genevieve LOCATION '/data/indexes';
-
---CREATE TEXT SEARCH CONFIGURATION name (
--- PARSER = parser_name |
--- COPY = source_config
---)
-
-CREATE TEXT SEARCH CONFIGURATION my_search_config (
- PARSER = my_parser
-);
-
---CREATE TEXT SEARCH DICTIONARY name (
--- TEMPLATE = template
--- [, option = value [, ... ]]
---)
-
-CREATE TEXT SEARCH DICTIONARY my_russian (
- template = snowball,
- language = russian,
- stopwords = myrussian
-);
-
---CREATE TEXT SEARCH PARSER name (
--- START = start_function ,
--- GETTOKEN = gettoken_function ,
--- END = end_function ,
--- LEXTYPES = lextypes_function
--- [, HEADLINE = headline_function ]
---)
-
-CREATE TEXT SEARCH PARSER my_search_parser (
- START = startNow(),
- GETTOKEN = getToken(),
- END = end(),
- LEXTYPES = getLexTypes()
-);
-
---CREATE TEXT SEARCH TEMPLATE name (
--- [ INIT = init_function , ]
--- LEXIZE = lexize_function
---)
-
-CREATE TEXT SEARCH TEMPLATE my_search_template (
- LEXIZE = lexizeNow()
-);
-
---CREATE TRIGGER name { BEFORE | AFTER } { event [ OR ... ] }
--- ON table [ FOR [ EACH ] { ROW | STATEMENT } ]
--- EXECUTE PROCEDURE funcname ( arguments )
--- 100 STATEMENTS *******************************************************
-CREATE TRIGGER trigger_name BEFORE dawn
- ON table
- EXECUTE PROCEDURE funcname ( 'arg1', 'arg2' );
-
-ALTER TABLE foreign_companies RENAME COLUMN address TO city;
-
-ALTER TABLE us_companies RENAME TO suppliers;
-
-ALTER TABLE old_addresses ALTER COLUMN street SET NOT NULL;
-
-ALTER TABLE new_addresses ALTER COLUMN street DROP NOT NULL;
-
-GRANT EXECUTE ON FUNCTION divideByTwo(numerator int, IN demoninator int) TO george;
-
--- 106 STATEMENTS *******************************************************
Added: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/revoke_test_statements.ddl
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/revoke_test_statements.ddl (rev 0)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/revoke_test_statements.ddl 2010-01-05 20:33:13 UTC (rev 1530)
@@ -0,0 +1,7 @@
+REVOKE SELECT ON TABLE purchaseOrders FROM maria,harry;
+
+REVOKE UPDATE, USAGE ON TABLE orderDetails FROM anita,zhi CASCADE;
+
+REVOKE SELECT ON TABLE orders.bills FROM PUBLIC RESTRICT;
+
+REVOKE INSERT(a, b, c) ON TABLE orderSummaries FROM purchases_reader_role;
\ No newline at end of file
Modified: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/standard_test_statements.ddl
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/standard_test_statements.ddl 2010-01-05 19:16:20 UTC (rev 1529)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/standard_test_statements.ddl 2010-01-05 20:33:13 UTC (rev 1530)
@@ -36,14 +36,3 @@
CREATE TABLE HOTELAVAILABILITY
(HOTEL_ID INT NOT NULL, BOOKING_DATE DATE NOT NULL,
ROOMS_TAKEN INT DEFAULT 0, PRIMARY KEY (HOTEL_ID, BOOKING_DATE));
-
-GRANT SELECT ON TABLE purchaseOrders TO maria,harry;
-
-GRANT UPDATE, USAGE ON TABLE billedOrders TO anita,zhi;
-
-GRANT SELECT ON TABLE orders.bills to PUBLIC;
-
-GRANT INSERT(a, b, c) ON TABLE purchaseOrders TO purchases_reader_role;
-
-
-
Deleted: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/test_cnd.cnd
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/test_cnd.cnd 2010-01-05 19:16:20 UTC (rev 1529)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/test_cnd.cnd 2010-01-05 20:33:13 UTC (rev 1530)
@@ -1,32 +0,0 @@
-//------------------------------------------------------------------------------
-// N A M E S P A C E S
-//------------------------------------------------------------------------------
-<jcr='http://www.jcp.org/jcr/1.0'>
-<nt='http://www.jcp.org/jcr/nt/1.0'>
-<mix='http://www.jcp.org/jcr/mix/1.0'>
-<ddl='http://www.jboss.org/dna/ddl/1.0'>
-
-//------------------------------------------------------------------------------
-// N O D E T Y P E S
-//------------------------------------------------------------------------------
-
-// =============================================================================
-// STATEMENT
-// =============================================================================
-[ddl:statement] mixin abstract
- - ddl:expression (string) mandatory
- + * (ddl:ddlProblem) = ddl:ddlProblem multiple
-
-// =============================================================================
-// CREATE SCHEMA
-// =============================================================================
-[ddl:schemaDefinition] > ddl:statement mixin
- - ddl:defaultCharacterSetName (STRING)
- + * (ddl:statement) = ddl:statement multiple
-
-// =============================================================================
-// DDL PROBLEM
-// =============================================================================
-[ddl:ddlProblem] mixin
- - ddl:problemLevel (LONG) mandatory
- - ddl:message (STRING) mandatory
\ No newline at end of file
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/DdlConstants.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/DdlConstants.java 2010-01-05 19:16:20 UTC (rev 1529)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/DdlConstants.java 2010-01-05 20:33:13 UTC (rev 1530)
@@ -38,6 +38,7 @@
public static final String DROP = "DROP";
public static final String FOREIGN = "FOREIGN";
public static final String GRANT = "GRANT";
+ public static final String REVOKE = "REVOKE";
public static final String INDEX = "INDEX";
public static final String INSERT = "INSERT";
public static final String UPDATE = "UPDATE";
@@ -99,7 +100,8 @@
public static final String[] STMT_CREATE_TRANSLATION = {CREATE, "TRANSLATION"};
public static final String[] STMT_ALTER_TABLE = {ALTER, TABLE};
public static final String[] STMT_ALTER_DOMAIN = {ALTER, "DOMAIN"};
- public static final String[] STMT_GRANT = {"GRANT"};
+ public static final String[] STMT_GRANT = {GRANT};
+ public static final String[] STMT_REVOKE = {REVOKE};
public static final String[] STMT_DROP_SCHEMA = {DROP, SCHEMA};
public static final String[] STMT_DROP_TABLE = {DROP, TABLE};
public static final String[] STMT_DROP_VIEW = {DROP, VIEW};
@@ -114,9 +116,9 @@
public final static String[][] SQL_92_ALL_PHRASES = {STMT_CREATE_SCHEMA, STMT_CREATE_TABLE,
STMT_CREATE_GLOBAL_TEMPORARY_TABLE, STMT_CREATE_LOCAL_TEMPORARY_TABLE, STMT_CREATE_VIEW, STMT_CREATE_OR_REPLACE_VIEW,
STMT_CREATE_ASSERTION, STMT_CREATE_CHARACTER_SET, STMT_CREATE_COLLATION, STMT_CREATE_TRANSLATION, STMT_CREATE_DOMAIN,
- STMT_ALTER_TABLE, STMT_ALTER_DOMAIN, STMT_GRANT, STMT_DROP_SCHEMA, STMT_DROP_TABLE, STMT_DROP_VIEW, STMT_DROP_DOMAIN,
- STMT_DROP_CHARACTER_SET, STMT_DROP_COLLATION, STMT_DROP_TRANSLATION, STMT_DROP_ASSERTION, STMT_INSERT_INTO,
- STMT_SET_DEFINE};
+ STMT_ALTER_TABLE, STMT_ALTER_DOMAIN, STMT_GRANT, STMT_REVOKE, STMT_DROP_SCHEMA, STMT_DROP_TABLE, STMT_DROP_VIEW,
+ STMT_DROP_DOMAIN, STMT_DROP_CHARACTER_SET, STMT_DROP_COLLATION, STMT_DROP_TRANSLATION, STMT_DROP_ASSERTION,
+ STMT_INSERT_INTO, STMT_SET_DEFINE};
// <schema definition>
// | <table definition>
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/StandardDdlLexicon.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/StandardDdlLexicon.java 2010-01-05 19:16:20 UTC (rev 1529)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/StandardDdlLexicon.java 2010-01-05 20:33:13 UTC (rev 1530)
@@ -100,6 +100,12 @@
public static final Name TYPE_GRANT_ON_COLLATION_STATEMENT = new BasicName(Namespace.URI, "grantOnCollationStatement");
public static final Name TYPE_GRANT_ON_CHARACTER_SET_STATEMENT = new BasicName(Namespace.URI, "grantOnCharacterSetStatement");
public static final Name TYPE_GRANT_ON_TRANSLATION_STATEMENT = new BasicName(Namespace.URI, "grantOnTranslationStatement");
+ public static final Name TYPE_REVOKE_STATEMENT = new BasicName(Namespace.URI, "revokeStatement");
+ public static final Name TYPE_REVOKE_ON_TABLE_STATEMENT = new BasicName(Namespace.URI, "revokeOnTableStatement");
+ public static final Name TYPE_REVOKE_ON_DOMAIN_STATEMENT = new BasicName(Namespace.URI, "revokeOnDomainStatement");
+ public static final Name TYPE_REVOKE_ON_COLLATION_STATEMENT = new BasicName(Namespace.URI, "revokeOnCollationStatement");
+ public static final Name TYPE_REVOKE_ON_CHARACTER_SET_STATEMENT = new BasicName(Namespace.URI, "revokeOnCharacterSetStatement");
+ public static final Name TYPE_REVOKE_ON_TRANSLATION_STATEMENT = new BasicName(Namespace.URI, "revokeOnTranslationStatement");
public static final Name TYPE_SET_STATEMENT = new BasicName(Namespace.URI, "setStatement");
public static final Name TYPE_INSERT_STATEMENT = new BasicName(Namespace.URI, "insertStatement");
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/StandardDdlParser.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/StandardDdlParser.java 2010-01-05 19:16:20 UTC (rev 1529)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/StandardDdlParser.java 2010-01-05 20:33:13 UTC (rev 1530)
@@ -276,6 +276,8 @@
stmtNode = parseSetStatement(tokens, node);
} else if (tokens.matches(GRANT)) {
stmtNode = parseGrantStatement(tokens, node);
+ } else if( tokens.matches(REVOKE)) {
+ stmtNode = parseRevokeStatement(tokens, node);
}
if (stmtNode == null) {
@@ -807,39 +809,38 @@
tokens.consume("GRANT");
- if( tokens.canConsume("ALL", "PRIVILEGES")) {
- allPrivileges = true;
- } else {
- parseGrantPrivileges(tokens, privileges);
- }
- tokens.consume("ON");
+ if( tokens.canConsume("ALL", "PRIVILEGES")) {
+ allPrivileges = true;
+ } else {
+ parseGrantPrivileges(tokens, privileges);
+ }
+ tokens.consume("ON");
- if( tokens.canConsume("DOMAIN") ) {
- String name = parseName(tokens);
- grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_DOMAIN_STATEMENT);
- } else if( tokens.canConsume("COLLATION")) {
- String name = parseName(tokens);
- grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_COLLATION_STATEMENT);
- } else if( tokens.canConsume("CHARACTER", "SET")) {
- String name = parseName(tokens);
- grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_CHARACTER_SET_STATEMENT);
- } else if( tokens.canConsume("TRANSLATION")) {
- String name = parseName(tokens);
- grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_TRANSLATION_STATEMENT);
- } else {
- tokens.canConsume(TABLE); // OPTIONAL
- String name = parseName(tokens);
- grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_TABLE_STATEMENT);
- }
-
+ if( tokens.canConsume("DOMAIN") ) {
+ String name = parseName(tokens);
+ grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_DOMAIN_STATEMENT);
+ } else if( tokens.canConsume("COLLATION")) {
+ String name = parseName(tokens);
+ grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_COLLATION_STATEMENT);
+ } else if( tokens.canConsume("CHARACTER", "SET")) {
+ String name = parseName(tokens);
+ grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_CHARACTER_SET_STATEMENT);
+ } else if( tokens.canConsume("TRANSLATION")) {
+ String name = parseName(tokens);
+ grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_TRANSLATION_STATEMENT);
+ } else {
+ tokens.canConsume(TABLE); // OPTIONAL
+ String name = parseName(tokens);
+ grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_TABLE_STATEMENT);
+ }
- // Attach privileges to grant node
- for( AstNode node : privileges ) {
- node.setParent(grantNode);
- }
- if( allPrivileges ) {
- grantNode.setProperty(ALL_PRIVILEGES, allPrivileges);
- }
+ // Attach privileges to grant node
+ for( AstNode node : privileges ) {
+ node.setParent(grantNode);
+ }
+ if( allPrivileges ) {
+ grantNode.setProperty(ALL_PRIVILEGES, allPrivileges);
+ }
tokens.consume("TO");
@@ -850,7 +851,7 @@
} while( tokens.canConsume(COMMA));
if( tokens.canConsume("WITH", "GRANT", "OPTION")) {
- grantNode.setProperty(ALL_PRIVILEGES, allPrivileges);
+ grantNode.setProperty(WITH_GRANT_OPTION, "WITH GRANT OPTION");
}
markEndOfStatement(tokens, grantNode);
@@ -909,6 +910,93 @@
} while( tokens.canConsume(COMMA));
}
+
+ protected AstNode parseRevokeStatement( DdlTokenStream tokens,
+ AstNode parentNode ) throws ParsingException {
+ assert tokens != null;
+ assert parentNode != null;
+ assert tokens.matches(REVOKE);
+
+ markStartOfStatement(tokens);
+
+ // <revoke statement> ::=
+ // REVOKE [ GRANT OPTION FOR ]
+ // <privileges>
+ // ON <object name>
+ // FROM <grantee> [ { <comma> <grantee> }... ] <drop behavior>
+
+ AstNode revokeNode = null;
+ boolean allPrivileges = false;
+ boolean withGrantOption = false;
+
+ List<AstNode> privileges = new ArrayList<AstNode>();
+
+ tokens.consume("REVOKE");
+
+ withGrantOption = tokens.canConsume("WITH", "GRANT", "OPTION");
+
+ if( tokens.canConsume("ALL", "PRIVILEGES")) {
+ allPrivileges = true;
+ } else {
+ parseGrantPrivileges(tokens, privileges);
+ }
+ tokens.consume("ON");
+
+ if( tokens.canConsume("DOMAIN") ) {
+ String name = parseName(tokens);
+ revokeNode = nodeFactory().node(name, parentNode, TYPE_REVOKE_ON_DOMAIN_STATEMENT);
+ } else if( tokens.canConsume("COLLATION")) {
+ String name = parseName(tokens);
+ revokeNode = nodeFactory().node(name, parentNode, TYPE_REVOKE_ON_COLLATION_STATEMENT);
+ } else if( tokens.canConsume("CHARACTER", "SET")) {
+ String name = parseName(tokens);
+ revokeNode = nodeFactory().node(name, parentNode, TYPE_REVOKE_ON_CHARACTER_SET_STATEMENT);
+ } else if( tokens.canConsume("TRANSLATION")) {
+ String name = parseName(tokens);
+ revokeNode = nodeFactory().node(name, parentNode, TYPE_REVOKE_ON_TRANSLATION_STATEMENT);
+ } else {
+ tokens.canConsume(TABLE); // OPTIONAL
+ String name = parseName(tokens);
+ revokeNode = nodeFactory().node(name, parentNode, TYPE_REVOKE_ON_TABLE_STATEMENT);
+ }
+
+ // Attach privileges to grant node
+ for( AstNode node : privileges ) {
+ node.setParent(revokeNode);
+ }
+
+ if( allPrivileges ) {
+ revokeNode.setProperty(ALL_PRIVILEGES, allPrivileges);
+ }
+
+ tokens.consume("FROM");
+
+ do {
+ String grantee = parseName(tokens);
+ nodeFactory().node(grantee, revokeNode, GRANTEE);
+ } while( tokens.canConsume(COMMA));
+
+ String behavior = null;
+
+ if (tokens.canConsume("CASCADE")) {
+ behavior = "CASCADE";
+ } else if (tokens.canConsume("RESTRICT")) {
+ behavior = "RESTRICT";
+ }
+
+ if (behavior != null) {
+ revokeNode.setProperty(DROP_BEHAVIOR, behavior);
+ }
+
+ if( withGrantOption ) {
+ revokeNode.setProperty(WITH_GRANT_OPTION, "WITH GRANT OPTION");
+ }
+
+ markEndOfStatement(tokens, revokeNode);
+
+ return revokeNode;
+ }
+
/**
* Catch-all method to parse unknown (not registered or handled by sub-classes) statements.
*
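The `<revoke statement>` grammar quoted in the parser comment above (REVOKE [ GRANT OPTION FOR ] privileges ON object FROM grantees, with an optional CASCADE/RESTRICT drop behavior) can be exercised outside the sequencer with a small token-stream model. The following is a hedged Python sketch, not the DNA `DdlTokenStream` API: `Tokens` and `parse_revoke` are invented names, it follows the standard's `GRANT OPTION FOR` prefix rather than the committed code's `WITH GRANT OPTION` check, and it only supports the simple whitespace/comma tokenization the sample statements need.

```python
# Sketch of the <revoke statement> grammar handled by parseRevokeStatement:
#   REVOKE [ GRANT OPTION FOR ] <privileges> ON <object name>
#     FROM <grantee> [, <grantee> ...] [ CASCADE | RESTRICT ]
# Illustrative only -- Tokens/parse_revoke are not the DNA API.

class Tokens:
    def __init__(self, words):
        self.words = list(words)
        self.pos = 0

    def can_consume(self, *expected):
        # Consume the expected keyword sequence if it matches (case-insensitive).
        n = len(expected)
        window = [w.upper() for w in self.words[self.pos:self.pos + n]]
        if window == [e.upper() for e in expected]:
            self.pos += n
            return True
        return False

    def consume(self, expected):
        if not self.can_consume(expected):
            raise ValueError("expected %r at %r" % (expected, self.words[self.pos:]))

    def next_word(self):
        word = self.words[self.pos]
        self.pos += 1
        return word


def parse_revoke(sql):
    # Naive tokenization: split on whitespace, treating commas as words.
    tokens = Tokens(sql.replace(",", " , ").rstrip(";").split())
    tokens.consume("REVOKE")
    grant_option_for = tokens.can_consume("GRANT", "OPTION", "FOR")
    privileges = []
    all_privileges = tokens.can_consume("ALL", "PRIVILEGES")
    if not all_privileges:
        privileges.append(tokens.next_word())
        while tokens.can_consume(","):
            privileges.append(tokens.next_word())
    tokens.consume("ON")
    tokens.can_consume("TABLE")  # the TABLE keyword is optional
    target = tokens.next_word()
    tokens.consume("FROM")
    grantees = [tokens.next_word()]
    while tokens.can_consume(","):
        grantees.append(tokens.next_word())
    behavior = None  # the optional <drop behavior>
    if tokens.can_consume("CASCADE"):
        behavior = "CASCADE"
    elif tokens.can_consume("RESTRICT"):
        behavior = "RESTRICT"
    return {
        "grant_option_for": grant_option_for,
        "all_privileges": all_privileges,
        "privileges": privileges,
        "target": target,
        "grantees": grantees,
        "behavior": behavior,
    }

# Example:
#   parse_revoke("REVOKE SELECT ON TABLE orders.bills FROM PUBLIC RESTRICT")
#   -> target 'orders.bills', grantees ['PUBLIC'], behavior 'RESTRICT'
```

This mirrors the branch structure of the committed method (privilege list vs. ALL PRIVILEGES, optional TABLE keyword, comma-separated grantees, then CASCADE/RESTRICT), which is also exactly what the four statements in revoke_test_statements.ddl exercise.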
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/oracle/OracleDdlConstants.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/oracle/OracleDdlConstants.java 2010-01-05 19:16:20 UTC (rev 1529)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/oracle/OracleDdlConstants.java 2010-01-05 20:33:13 UTC (rev 1530)
@@ -199,7 +199,6 @@
static final String[] STMT_NOAUDIT = {"NOAUDIT"};
static final String[] STMT_PURGE = {"PURGE"};
static final String[] STMT_RENAME = {"RENAME"};
- static final String[] STMT_REVOKE = {"REVOKE"};
static final String[] STMT_ROLLBACK_TO_SAVEPOINT = {"ROLLBACK", "TO", "SAVEPOINT"};
static final String[] STMT_ROLLBACK_WORK = {"ROLLBACK", "WORK"};
static final String[] STMT_ROLLBACK = {"ROLLBACK"};
@@ -217,7 +216,7 @@
static final String[][] MISC_PHRASES = {
STMT_ANALYZE, STMT_ASSOCIATE_STATISTICS, STMT_AUDIT, STMT_COMMIT, STMT_COMMENT_ON, STMT_DISASSOCIATE_STATISTICS,
STMT_EXPLAIN_PLAN, STMT_FLASHBACK, STMT_LOCK_TABLE, STMT_MERGE, STMT_NOAUDIT, STMT_PURGE,
- STMT_RENAME, STMT_REVOKE, STMT_ROLLBACK_TO_SAVEPOINT, STMT_ROLLBACK_WORK, STMT_ROLLBACK, STMT_SAVEPOINT, STMT_TRUNCATE
+ STMT_RENAME, STMT_ROLLBACK_TO_SAVEPOINT, STMT_ROLLBACK_WORK, STMT_ROLLBACK, STMT_SAVEPOINT, STMT_TRUNCATE
};
// CREATE TABLE, CREATE VIEW, and GRANT statements.
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/oracle/OracleDdlParser.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/oracle/OracleDdlParser.java 2010-01-05 19:16:20 UTC (rev 1529)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/oracle/OracleDdlParser.java 2010-01-05 20:33:13 UTC (rev 1530)
@@ -267,8 +267,6 @@
return parseStatement(tokens, STMT_PURGE, parentNode, TYPE_PURGE_STATEMENT);
} else if (tokens.matches(STMT_RENAME)) {
return parseStatement(tokens, STMT_RENAME, parentNode, TYPE_RENAME_STATEMENT);
- } else if (tokens.matches(STMT_REVOKE)) {
- return parseStatement(tokens, STMT_REVOKE, parentNode, TYPE_REVOKE_STATEMENT);
} else if (tokens.matches(STMT_ROLLBACK)) {
return parseStatement(tokens, STMT_ROLLBACK, parentNode, TYPE_ROLLBACK_STATEMENT);
} else if (tokens.matches(STMT_ROLLBACK_WORK)) {
@@ -580,7 +578,19 @@
// return super.parseGrantStatement(tokens, parentNode);
}
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.sequencer.ddl.StandardDdlParser#parseRevokeStatement(org.jboss.dna.sequencer.ddl.DdlTokenStream, org.jboss.dna.sequencer.ddl.node.AstNode)
+ */
@Override
+ protected AstNode parseRevokeStatement( DdlTokenStream tokens,
+ AstNode parentNode ) throws ParsingException {
+ return parseStatement(tokens, STMT_REVOKE, parentNode, TYPE_REVOKE_STATEMENT);
+ }
+
+ @Override
protected AstNode parseAlterTableStatement( DdlTokenStream tokens,
AstNode parentNode ) throws ParsingException {
assert tokens != null;
Modified: trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/StandardDdlParserTest.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/StandardDdlParserTest.java 2010-01-05 19:16:20 UTC (rev 1529)
+++ trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/StandardDdlParserTest.java 2010-01-05 20:33:13 UTC (rev 1530)
@@ -847,4 +847,17 @@
assertThat(true, is(success));
assertThat(rootNode.getChildCount(), is(4));
}
+
+ @Test
+ public void shouldParseRevokeStatements() {
+ printTest("shouldParseRevokeStatements()");
+ String content = "REVOKE SELECT ON TABLE purchaseOrders FROM maria,harry;" + NEWLINE
+ + "REVOKE UPDATE, USAGE ON TABLE purchaseOrders FROM anita,zhi CASCADE;" + NEWLINE
+ + "REVOKE SELECT ON TABLE orders.bills FROM PUBLIC RESTRICT;" + NEWLINE
+ + "REVOKE INSERT(a, b, c) ON TABLE purchaseOrders FROM purchases_reader_role;";
+
+ boolean success = parser.parse(content, rootNode);
+ assertThat(true, is(success));
+ assertThat(rootNode.getChildCount(), is(4));
+ }
}
DNA SVN: r1529 - trunk.
by dna-commits@lists.jboss.org
Author: bcarothers
Date: 2010-01-05 14:16:20 -0500 (Tue, 05 Jan 2010)
New Revision: 1529
Removed:
trunk/dna-search/
Log:
DNA-622 Remove from SVN the 'dna-search' project, which contains only empty directories
Removing the empty directories as per the defect.
DNA SVN: r1528 - in trunk/dna-integration-tests/src/test: resources/org/jboss/dna/test/integration/sequencer/ddl and 1 other directory.
by dna-commits@lists.jboss.org
Author: blafond
Date: 2010-01-05 08:33:29 -0500 (Tue, 05 Jan 2010)
New Revision: 1528
Added:
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/PostgresDdl.cnd
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/standard_test_statements.ddl
Removed:
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/PostgresDdl.cnd
Modified:
trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/DdlSequencerIntegrationTest.java
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/DerbyDdl.cnd
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/StandardDdl.cnd
trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/postgres_test_statements.ddl
Log:
DNA-49 integration test changes for additional parsing. Includes parsing for Grant statements in Standard, Derby & Postgres
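For orientation before the diff: the tests below assert that a GRANT statement is sequenced into a statement node carrying a `ddl:grantPrivilege` child and one `ddl:grantee` child per grantee. A simplified, hypothetical illustration of that decomposition (plain Java with a regex; the real DNA sequencers are token-stream based, not regex based):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class GrantSketch {
    // Hypothetical, simplified pattern covering only "GRANT <priv> ON TABLE <name> TO <grantees>;".
    private static final Pattern GRANT = Pattern.compile(
        "GRANT\\s+(.+?)\\s+ON\\s+TABLE\\s+(\\S+)\\s+TO\\s+(.+?);");

    public static void main(String[] args) {
        Matcher m = GRANT.matcher("GRANT SELECT ON TABLE purchaseOrders TO maria,harry;");
        if (m.matches()) {
            System.out.println(m.group(1)); // privilege  -> ddl:grantPrivilege node
            System.out.println(m.group(2)); // table name -> grant statement node name
            System.out.println(m.group(3)); // grantees   -> one ddl:grantee node per name
        }
    }
}
```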
Modified: trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/DdlSequencerIntegrationTest.java
===================================================================
--- trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/DdlSequencerIntegrationTest.java 2010-01-05 13:31:47 UTC (rev 1527)
+++ trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/sequencer/ddl/DdlSequencerIntegrationTest.java 2010-01-05 13:33:29 UTC (rev 1528)
@@ -23,6 +23,7 @@
*/
package org.jboss.dna.test.integration.sequencer.ddl;
+import static org.junit.Assert.assertNotNull;
import static org.hamcrest.core.Is.is;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertThat;
@@ -199,6 +200,53 @@
}
@Test
+ public void shouldSequenceStandardDdlFile() throws Exception {
+ System.out.println("STARTED: shouldSequenceStandardDdlFile(standard_test_statements.ddl)");
+ URL url = getUrl(cndDdlFolder + "standard_test_statements.ddl");
+ uploadFile(url);
+
+ waitUntilSequencedNodesIs(1);
+
+ // Find the node ...
+ Node root = session.getRootNode();
+
+ if (root.hasNode("ddls")) {
+ Node ddlsNode = root.getNode("ddls");
+ //System.out.println(" | NAME: " + ddlsNode.getName() + " PATH: " + ddlsNode.getPath());
+ for (NodeIterator iter = ddlsNode.getNodes(); iter.hasNext();) {
+ Node ddlNode = iter.nextNode();
+
+ //printNodeProperties(ddlNode);
+
+ long numStatements = ddlNode.getNodes().nextNode().getNodes().getSize();
+ assertEquals(numStatements, 11);
+
+ //GRANT SELECT ON TABLE purchaseOrders TO maria,harry;
+ Node grantNode = findNode(ddlNode, "purchaseOrders", "ddl:grantOnTableStatement");
+ assertNotNull(grantNode);
+ Node granteeNode = findNode(grantNode, "maria", "ddl:grantee");
+ assertNotNull(granteeNode);
+ Node privNode = findNode(grantNode, "privilege", "ddl:grantPrivilege");
+ assertNotNull(privNode);
+ verifySingleValueProperty(privNode, "ddl:type", "SELECT");
+
+ //GRANT UPDATE, USAGE ON TABLE billedOrders TO anita,zhi;
+ grantNode = findNode(ddlNode, "billedOrders", "ddl:grantOnTableStatement");
+ assertNotNull(grantNode);
+ privNode = findNode(grantNode, "privilege", "ddl:grantPrivilege");
+ assertNotNull(privNode);
+ verifySingleValueProperty(privNode, "ddl:type", "UPDATE");
+ granteeNode = findNode(grantNode, "anita", "ddl:grantee");
+ assertNotNull(granteeNode);
+ }
+ }
+
+ System.out.println("FINISHED: shouldSequenceStandardDdlFile(standard_test_statements.ddl)");
+ }
+
+ @Test
public void shouldSequenceDerbyDdlFile() throws Exception {
System.out.println("STARTED: shouldSequenceDerbyDdlFile(derby_test_statements.ddl)");
URL url = getUrl(cndDdlFolder + "derby_test_statements.ddl");
@@ -225,6 +273,89 @@
verifyNode(ddlNode, "SAMP.DEPARTMENT", "ddl:expression");
verifyNode(ddlNode, "HOTEL_ID", "ddl:datatypeName");
verifyNode(ddlNode, "CITIES", "ddl:startLineNumber");
+
+ // Create Function
+ verifyNode(ddlNode, "PROPERTY_FILE_READER", "ddl:startLineNumber", 71);
+ verifyNodeTypes(ddlNode, "PROPERTY_FILE_READER",
+ "derbyddl:createFunctionStatement",
+ "ddl:creatable",
+ "derbyddl:functionOperand");
+ verifyNode(ddlNode, "KEY_COL", "ddl:datatypeName", "VARCHAR");
+
+ Node functionNode = findNode(ddlNode, "TO_DEGREES");
+ assertNotNull(functionNode);
+ verifyChildNode(functionNode, "parameterStyle", "ddl:value", "PARAMETER STYLE JAVA");
+
+ // Create Index
+ // CREATE INDEX IXSALE ON SAMP.SALES (SALES);
+ Node indexNode = findNode(ddlNode, "IXSALE", "derbyddl:createIndexStatement");
+ assertNotNull(indexNode);
+ verifySimpleStringProperty(indexNode, "derbyddl:tableName", "SAMP.SALES");
+ Node colRefNode = findNode(indexNode, "SALES");
+ assertNotNull(colRefNode);
+ colRefNode = findNode(ddlNode, "SALES", "derbyddl:indexColumnReference");
+ assertNotNull(colRefNode);
+ verifyNodeTypes(colRefNode, "SALES",
+ "derbyddl:indexColumnReference",
+ "ddl:columnReference",
+ "ddl:referenceOperand");
+
+ // declare global temporary table SESSION.t1(c11 int) not logged;
+ Node ttNode = findNode(ddlNode, "SESSION.t1", "derbyddl:declareGlobalTemporaryTableStatement");
+ assertNotNull(ttNode);
+ Node colNode = findNode(ttNode, "c11");
+ assertNotNull(colNode);
+ verifySimpleStringProperty(colNode, "ddl:datatypeName", "int");
+
+ // LOCK TABLE FlightAvailability IN EXCLUSIVE MODE;
+ Node lockNode = findNode(ddlNode, "FlightAvailability", "derbyddl:lockTableStatement");
+ assertNotNull(lockNode);
+ Node optionNode = findNode(lockNode, "lockMode");
+ assertNotNull(optionNode);
+ verifySimpleStringProperty(optionNode, "ddl:value", "EXCLUSIVE");
+
+ // RENAME TABLE SAMP.EMP_ACT TO EMPLOYEE_ACT
+ Node renameTableNode = findNode(ddlNode, "SAMP.EMP_ACT", "derbyddl:renameTableStatement");
+ assertNotNull(renameTableNode);
+ verifySimpleStringProperty(renameTableNode, "ddl:newName", "EMPLOYEE_ACT");
+
+ // CREATE SYNONYM SAMP.T1 FOR SAMP.TABLEWITHLONGNAME;
+ Node synonymNode = findNode(ddlNode, "SAMP.T1", "derbyddl:createSynonymStatement");
+ assertNotNull(synonymNode);
+ verifySimpleStringProperty(synonymNode, "derbyddl:tableName", "SAMP.TABLEWITHLONGNAME");
+
+ //CREATE TRIGGER FLIGHTSDELETE3
+ // AFTER DELETE ON FLIGHTS
+ // REFERENCING OLD AS OLD
+ // FOR EACH ROW
+ // DELETE FROM FLIGHTAVAILABILITY WHERE FLIGHT_ID = OLD.FLIGHT_ID;
+ Node triggerNode = findNode(ddlNode, "FLIGHTSDELETE3", "derbyddl:createTriggerStatement");
+ assertNotNull(triggerNode);
+ verifySimpleStringProperty(triggerNode, "derbyddl:tableName", "FLIGHTS");
+
+ //CREATE TRIGGER t1 NO CASCADE BEFORE UPDATE ON x
+ // FOR EACH ROW MODE DB2SQL
+ // values app.notifyEmail('Jerry', 'Table x is about to be updated');
+ triggerNode = findNode(ddlNode, "t1", "derbyddl:createTriggerStatement");
+ assertNotNull(triggerNode);
+ verifySimpleStringProperty(triggerNode, "derbyddl:tableName", "x");
+ optionNode = findNode(triggerNode, "forEach");
+ assertNotNull(optionNode);
+ verifySimpleStringProperty(optionNode, "ddl:value", "FOR EACH ROW");
+ optionNode = findNode(triggerNode, "eventType");
+ assertNotNull(optionNode);
+ verifySimpleStringProperty(optionNode, "ddl:value", "UPDATE");
+
+ //GRANT EXECUTE ON PROCEDURE p TO george;
+ Node grantNode = findNode(ddlNode, "p", "derbyddl:grantOnProcedureStatement");
+ assertNotNull(grantNode);
+
+ //GRANT purchases_reader_role TO george,maria;
+ grantNode = findNode(ddlNode, "grantRoles", "derbyddl:grantRolesStatement");
+ assertNotNull(grantNode);
+ Node roleNode = findNode(grantNode, "george", "ddl:grantee");
+ assertNotNull(roleNode);
+
}
}
}
@@ -288,7 +419,7 @@
Node ddlNode = iter.nextNode();
long numStatements = ddlNode.getNodes().nextNode().getNodes().getSize();
- assertEquals(numStatements, 101);
+ assertEquals(numStatements, 106);
//printNodeProperties(ddlNode);
@@ -307,6 +438,20 @@
verifyNode(ddlNode, "my_function", "ddl:startLineNumber", 44);
verifyNode(ddlNode, "my_function", "ddl:startCharIndex", 1573);
verifyNode(ddlNode, "my_function", "postgresddl:comment", "'Returns Roman Numeral'");
+
+ //ALTER TABLE foreign_companies RENAME COLUMN address TO city;
+ Node alterTableNode = findNode(ddlNode, "foreign_companies", "postgresddl:alterTableStatement");
+ assertNotNull(alterTableNode);
+ Node renameColNode = findNode(alterTableNode, "address","postgresddl:renamedColumn");
+ assertNotNull(renameColNode);
+ verifySingleValueProperty(renameColNode, "ddl:newName", "city");
+
+ //GRANT EXECUTE ON FUNCTION divideByTwo(numerator int, IN demoninator int) TO george;
+ Node grantNode = findNode(ddlNode, "divideByTwo", "postgresddl:grantOnFunctionStatement");
+ assertNotNull(grantNode);
+ Node parameter_1 = findNode(grantNode, "numerator","postgresddl:functionParameter");
+ assertNotNull(parameter_1);
+ verifySingleValueProperty(parameter_1, "ddl:datatypeName", "int");
}
}
}
@@ -345,12 +490,33 @@
public void printNodeProperties(Node node) throws Exception {
printProperties(node);
+
for (NodeIterator iter = node.getNodes(); iter.hasNext();) {
printNodeProperties(iter.nextNode());
}
}
+ private void verifyChildNode(Node parentNode, String childNodeName, String propName, String expectedValue) throws Exception {
+ // Find child node
+ Node childNode = null;
+ for (NodeIterator iter = parentNode.getNodes(); iter.hasNext();) {
+ Node nextNode = iter.nextNode();
+ if( nextNode.getName().equals(childNodeName)) {
+ childNode = nextNode;
+ break;
+ }
+ }
+ if( childNode != null ) {
+ assertThat( childNode.hasProperty(propName), is(true));
+ verifySingleValueProperty(childNode, propName, expectedValue);
+
+ } else {
+ fail("NODE: " + childNodeName + " not found");
+ }
+
+ }
+
private void verifyNode(Node topNode, String name, String propName) throws Exception {
Node node = findNode(topNode, name);
@@ -362,6 +528,11 @@
}
+ private void verifySimpleStringProperty(Node node, String propName, String expectedValue) throws Exception {
+ assertThat( node.hasProperty(propName), is(true));
+ verifySingleValueProperty(node, propName, expectedValue);
+ }
+
private void verifyNode(Node topNode, String name, String propName, String expectedValue) throws Exception {
Node node = findNode(topNode, name);
@@ -433,15 +604,19 @@
}
}
- private void verifyMixin(Node node, String nodeType) throws Exception {
- boolean foundMixin = false;
+ private boolean hasMixin(Node node, String nodeType) throws Exception {
for( NodeType mixin : node.getMixinNodeTypes() ) {
String mixinName = mixin.getName();
if( mixinName.equals(nodeType) ) {
- foundMixin = true;
- break;
+ return true;
}
}
+ return false;
+ }
+
+ private void verifyMixin(Node node, String nodeType) throws Exception {
+ boolean foundMixin = hasMixin(node, nodeType);
+
assertThat(foundMixin, is(true));
}
@@ -457,6 +632,20 @@
}
+ private void verifyNodeTypes(Node topNode, String nodeName, String nodeTypeName, String...moreNodeTypeNames) throws Exception {
+ Node node = findNode(topNode, nodeName);
+
+ if( node != null ) {
+ assertThat(node.isNodeType(nodeTypeName), is(true));
+ for( String nextTypeName : moreNodeTypeNames ) {
+ assertThat(node.isNodeType(nextTypeName), is(true));
+ }
+ } else {
+ fail("NODE: " + nodeName + " not found");
+ }
+
+ }
+
private Node findNode(Node node, String name) throws Exception {
if( node.getName().equals(name)) {
return node;
@@ -475,6 +664,24 @@
return null;
}
+ private Node findNode(Node node, String name, String type) throws Exception {
+ if( node.getName().equals(name) && node.isNodeType(type)) {
+ return node;
+ }
+ for (NodeIterator iter = node.getNodes(); iter.hasNext();) {
+ Node nextNode = iter.nextNode();
+ if( nextNode.getName().equals(name) && nextNode.isNodeType(type)) {
+ return nextNode;
+ }
+ Node someNode = findNode(nextNode, name, type);
+ if( someNode != null ) {
+ return someNode;
+ }
+ }
+
+ return null;
+ }
+
private void printProperties( Node node ) throws RepositoryException, PathNotFoundException, ValueFormatException {
System.out.println("\n >>> NODE PATH: " + node.getPath() );
Modified: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/DerbyDdl.cnd
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/DerbyDdl.cnd 2010-01-05 13:31:47 UTC (rev 1527)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/DerbyDdl.cnd 2010-01-05 13:33:29 UTC (rev 1528)
@@ -41,26 +41,44 @@
[derbyddl:synonymOperand] > ddl:operand abstract
[derbyddl:triggerOperand] > ddl:operand abstract
+[derbyddl:roleName] > derbyddl:roleOperand mixin
+
// =============================================================================
// COLUMN
// =============================================================================
[derbyddl:columnDefinition] > ddl:columnDefinition mixin
- derbyddl:dropDefault (boolean)
+[derbyddl:functionParameter] > ddl:columnDefinition mixin
+
+[derbyddl:indexColumnReference] > ddl:columnReference mixin
+ - derbyddl:order (STRING)
+
// =============================================================================
// CREATE STATEMENTS
// =============================================================================
[derbyddl:createFunctionStatement] > ddl:creatable, ddl:statement, derbyddl:functionOperand mixin
-[derbyddl:createIndex] > ddl:statement, ddl:creatable, derbyddl:indexOperand mixin
+ - ddl:datatypeName (STRING) mandatory
+ - ddl:datatypeLength (LONG)
+ - ddl:datatypePrecision (LONG)
+ - ddl:datatypeScale (LONG)
+ - ddl:isTableType (boolean)
+ + * (derbyddl:functionParameter) = derbyddl:functionParameter multiple
+ + * (ddl:statementOption) = ddl:statementOption multiple
+[derbyddl:createIndexStatement] > ddl:statement, ddl:creatable, derbyddl:indexOperand mixin
- derbyddl:tableName (string) mandatory
- derbyddl:unique (boolean)
- + * (ddl:columnReference) = ddl:columnReference multiple
+ + * (derbyddl:indexColumnReference) = derbyddl:indexColumnReference multiple
[derbyddl:createProcedureStatement] > ddl:creatable, ddl:statement, derbyddl:procedureOperand mixin
[derbyddl:createRoleStatement] > ddl:creatable, ddl:statement, derbyddl:roleOperand mixin
[derbyddl:createSynonymStatement] > ddl:creatable, ddl:statement, derbyddl:synonymOperand mixin
+ - derbyddl:tableName (string) mandatory
[derbyddl:createTriggerStatement] > ddl:creatable, ddl:statement, derbyddl:triggerOperand mixin
+ - derbyddl:tableName (string) mandatory
+ - ddl:sql (string) mandatory
+ + * (ddl:columnReference) = ddl:columnReference multiple
+[derbyddl:declareGlobalTemporaryTableStatement] > ddl:createTableStatement mixin
-
// =============================================================================
// DROP STATEMENTS
// =============================================================================
@@ -70,3 +88,15 @@
[derbyddl:dropRoleStatement] > ddl:droppable, derbyddl:roleOperand mixin
[derbyddl:dropSynonymStatement] > ddl:droppable, derbyddl:synonymOperand mixin
[derbyddl:dropTriggerStatement] > ddl:droppable, derbyddl:triggerOperand mixin
+
+// =============================================================================
+// MISC STATEMENTS
+// =============================================================================
+[derbyddl:lockTableStatement] > ddl:statement, ddl:tableOperand mixin
+[derbyddl:renameTableStatement] > ddl:statement, ddl:renamable, ddl:tableOperand mixin
+
+[derbyddl:grantOnFunctionStatement] > ddl:grantStatement, derbyddl:functionOperand mixin
+[derbyddl:grantOnProcedureStatement] > ddl:grantStatement, derbyddl:procedureOperand mixin
+
+[derbyddl:grantRolesStatement] > ddl:grantStatement mixin
+ + ddl:name (derbyddl:roleName) = derbyddl:roleName multiple
\ No newline at end of file
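The `findNode(Node, String, String)` helper added to the test above is a plain depth-first search over JCR children. A self-contained sketch of the same traversal against a toy tree (plain Java stand-in for `javax.jcr.Node`, without the node-type check):

```java
import java.util.ArrayList;
import java.util.List;

public class FindNodeSketch {
    // Minimal stand-in for javax.jcr.Node: a name plus an ordered child list.
    static class N {
        final String name;
        final List<N> children = new ArrayList<>();
        N(String name) { this.name = name; }
        N add(String child) { N c = new N(child); children.add(c); return c; }
    }

    // Depth-first search by name, mirroring the test's findNode(Node, String) helper:
    // check the current node, then recurse into each child, returning the first match.
    static N find(N node, String name) {
        if (node.name.equals(name)) return node;
        for (N child : node.children) {
            N match = find(child, name);
            if (match != null) return match;
        }
        return null;
    }

    public static void main(String[] args) {
        N root = new N("ddls");
        N ddl = root.add("standard_test_statements.ddl");
        ddl.add("purchaseOrders").add("maria");
        System.out.println(find(root, "maria") != null);   // found deep in the tree
        System.out.println(find(root, "missing") == null); // absent name yields null
    }
}
```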
Deleted: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/PostgresDdl.cnd
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/PostgresDdl.cnd 2010-01-05 13:31:47 UTC (rev 1527)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/PostgresDdl.cnd 2010-01-05 13:33:29 UTC (rev 1528)
@@ -1,178 +0,0 @@
-/*
- * JBoss DNA (http://www.jboss.org/dna)
- * See the COPYRIGHT.txt file distributed with this work for information
- * regarding copyright ownership. Some portions may be licensed
- * to Red Hat, Inc. under one or more contributor license agreements.
- * See the AUTHORS.txt file in the distribution for a full listing of
- * individual contributors.
- *
- * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
- * is licensed to you under the terms of the GNU Lesser General Public License as
- * published by the Free Software Foundation; either version 2.1 of
- * the License, or (at your option) any later version.
- *
- * JBoss DNA is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with this software; if not, write to the Free
- * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
- * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
- */
-
- //------------------------------------------------------------------------------
-// N A M E S P A C E S
-//------------------------------------------------------------------------------
-<jcr='http://www.jcp.org/jcr/1.0'>
-<nt='http://www.jcp.org/jcr/nt/1.0'>
-<mix='http://www.jcp.org/jcr/mix/1.0'>
-<ddl='http://www.jboss.org/dna/ddl/1.0'>
-<postgresddl='http://www.jboss.org/dna/ddl/postgres/1.0'>
-
-// =============================================================================
-// OPERANDS
-// =============================================================================
-[postgresddl:aggregateOperand] > ddl:operand abstract
-[postgresddl:castOperand] > ddl:operand abstract
-[postgresddl:commentOperand] > ddl:operand abstract
-[postgresddl:constraintTriggerOperand] > ddl:operand abstract
-[postgresddl:conversionOperand] > ddl:operand abstract
-[postgresddl:databaseOperand] > ddl:operand abstract
-[postgresddl:foreignDataOperand] > ddl:operand abstract
-[postgresddl:groupOperand] > ddl:operand abstract
-[postgresddl:functionOperand] > ddl:operand abstract
-[postgresddl:indexOperand] > ddl:operand abstract
-[postgresddl:languageOperand] > ddl:operand abstract
-[postgresddl:operatorOperand] > ddl:operand abstract
-[postgresddl:ownedByOperand] > ddl:operand abstract
-[postgresddl:roleOperand] > ddl:operand abstract
-[postgresddl:ruleOperand] > ddl:operand abstract
-[postgresddl:sequenceOperand] > ddl:operand abstract
-[postgresddl:serverOperand] > ddl:operand abstract
-[postgresddl:tablespaceOperand] > ddl:operand abstract
-[postgresddl:textSearchOperand] > ddl:operand abstract
-[postgresddl:triggerOperand] > ddl:operand abstract
-[postgresddl:typeOperand] > ddl:operand abstract
-[postgresddl:userOperand] > ddl:operand abstract
-[postgresddl:userMappingOperand] > ddl:operand abstract
-
-
-// =============================================================================
-// ALTER STATEMENTS
-// =============================================================================
-[postgresddl:alterAggregateStatement] > ddl:alterable, ddl:statement, postgresddl:aggregateOperand mixin
-[postgresddl:alterConversionStatement] > ddl:alterable, ddl:statement, postgresddl:conversionOperand mixin
-[postgresddl:alterDatabaseStatement] > ddl:alterable, ddl:statement, postgresddl:databaseOperand mixin
-[postgresddl:alterForeignDataWrapperStatement] > ddl:alterable, ddl:statement, postgresddl:foreignDataOperand mixin
-[postgresddl:alterFunctionStatement] > ddl:alterable, ddl:statement, postgresddl:functionOperand mixin
-[postgresddl:alterGroupStatement] > ddl:alterable, ddl:statement, postgresddl:groupOperand mixin
-[postgresddl:alterIndexStatement] > ddl:alterable, ddl:statement, postgresddl:indexOperand mixin
-[postgresddl:alterLanguageStatement] > ddl:alterable, ddl:statement, postgresddl:languageOperand mixin
-[postgresddl:alterOperatorStatement] > ddl:alterable, ddl:statement, postgresddl:operatorOperand mixin
-[postgresddl:alterRoleStatement] > ddl:alterable, ddl:statement, postgresddl:roleOperand mixin
-[postgresddl:alterSchemaStatement] > ddl:alterable, ddl:statement, ddl:schemaOperand mixin
-[postgresddl:alterSequenceStatement] > ddl:alterable, ddl:statement, postgresddl:sequenceOperand mixin
-[postgresddl:alterServerStatement] > ddl:alterable, ddl:statement, postgresddl:serverOperand mixin
-[postgresddl:alterTablespaceStatement] > ddl:alterable, ddl:statement, postgresddl:tablespaceOperand mixin
-[postgresddl:alterTextSearchStatement] > ddl:alterable, ddl:statement, postgresddl:textSearchOperand mixin
-[postgresddl:alterTriggerStatement] > ddl:alterable, ddl:statement, postgresddl:triggerOperand mixin
-[postgresddl:alterTypeStatement] > ddl:alterable, ddl:statement, postgresddl:typeOperand mixin
-[postgresddl:alterUserStatement] > ddl:alterable, ddl:statement, postgresddl:userOperand mixin
-[postgresddl:alterUserMappingStatement] > ddl:alterable, ddl:statement, postgresddl:userMappingOperand mixin
-[postgresddl:alterViewStatement] > ddl:alterable, ddl:statement, ddl:viewOperand mixin
-
-[postgresddl:alterTableStatement] > ddl:alterTableStatement mixin
- - postgresddl:newTableName (STRING)
- - postgresddl:schemaName (STRING)
- + postgresddl:renameColumn (ddl:renamable) = ddl:renamable multiple
-
-
-// =============================================================================
-// CREATE STATEMENTS
-// =============================================================================
-
-[postgresddl:createAggregateStatement] > ddl:creatable, ddl:statement, postgresddl:aggregateOperand mixin
-[postgresddl:createCastStatement] > ddl:creatable, ddl:statement, postgresddl:castOperand mixin
-[postgresddl:createConstraintTriggerStatement] > ddl:creatable, ddl:statement, postgresddl:constraintTriggerOperand mixin
-[postgresddl:createConversionStatement] > ddl:creatable, ddl:statement, postgresddl:conversionOperand mixin
-[postgresddl:createDatabaseStatement] > ddl:creatable, ddl:statement, postgresddl:databaseOperand mixin
-[postgresddl:createForeignDataWrapperStatement] > ddl:creatable, ddl:statement, postgresddl:foreignDataOperand mixin
-[postgresddl:createFunctionStatement] > ddl:creatable, ddl:statement, postgresddl:functionOperand mixin
-[postgresddl:createGroupStatement] > ddl:creatable, ddl:statement, postgresddl:groupOperand mixin
-[postgresddl:createIndexStatement] > ddl:creatable, ddl:statement, postgresddl:indexOperand mixin
-[postgresddl:createLanguageStatement] > ddl:creatable, ddl:statement, postgresddl:languageOperand mixin
-[postgresddl:createOperatorStatement] > ddl:creatable, ddl:statement, postgresddl:operatorOperand mixin
-[postgresddl:createRoleStatement] > ddl:creatable, ddl:statement, postgresddl:roleOperand mixin
-[postgresddl:createRuleStatement] > ddl:creatable, ddl:statement, postgresddl:ruleOperand mixin
-[postgresddl:createSequenceStatement] > ddl:creatable, ddl:statement, postgresddl:sequenceOperand mixin
-[postgresddl:createServerStatement] > ddl:creatable, ddl:statement, postgresddl:serverOperand mixin
-[postgresddl:createTablespaceStatement] > ddl:creatable, ddl:statement, postgresddl:tablespaceOperand mixin
-[postgresddl:createTextSearchStatement] > ddl:creatable, ddl:statement, postgresddl:textSearchOperand mixin
-[postgresddl:createTriggerStatement] > ddl:creatable, ddl:statement, postgresddl:triggerOperand mixin
-[postgresddl:createTypeStatement] > ddl:creatable, ddl:statement, postgresddl:typeOperand mixin
-[postgresddl:createUserStatement] > ddl:creatable, ddl:statement, postgresddl:userOperand mixin
-[postgresddl:createUserMappingStatement] > ddl:creatable, ddl:statement, postgresddl:userMappingOperand mixin
-
-// =============================================================================
-// DROP STATEMENTS
-// =============================================================================
-
-[postgresddl:dropAggregateStatement] > ddl:droppable, ddl:statement, postgresddl:aggregateOperand mixin
-[postgresddl:dropCastStatement] > ddl:droppable, ddl:statement, postgresddl:castOperand mixin
-[postgresddl:dropConstraintTriggerStatement] > ddl:droppable, ddl:statement, postgresddl:constraintTriggerOperand mixin
-[postgresddl:dropConversionStatement] > ddl:droppable, ddl:statement, postgresddl:conversionOperand mixin
-[postgresddl:dropDatabaseStatement] > ddl:droppable, ddl:statement, postgresddl:databaseOperand mixin
-[postgresddl:dropForeignDataWrapperStatement] > ddl:droppable, ddl:statement, postgresddl:foreignDataOperand mixin
-[postgresddl:dropFunctionStatement] > ddl:droppable, ddl:statement, postgresddl:functionOperand mixin
-[postgresddl:dropGroupStatement] > ddl:droppable, ddl:statement, postgresddl:groupOperand mixin
-[postgresddl:dropIndexStatement] > ddl:droppable, ddl:statement, postgresddl:indexOperand mixin
-[postgresddl:dropLanguageStatement] > ddl:droppable, ddl:statement, postgresddl:languageOperand mixin
-[postgresddl:dropOperatorStatement] > ddl:droppable, ddl:statement, postgresddl:operatorOperand mixin
-[postgresddl:dropOwnedByStatement] > ddl:droppable, ddl:statement, postgresddl:ownedByOperand mixin
-[postgresddl:dropRoleStatement] > ddl:droppable, ddl:statement, postgresddl:roleOperand mixin
-[postgresddl:dropRuleStatement] > ddl:droppable, ddl:statement, postgresddl:ruleOperand mixin
-[postgresddl:dropSequenceStatement] > ddl:droppable, ddl:statement, postgresddl:sequenceOperand mixin
-[postgresddl:dropServerStatement] > ddl:droppable, ddl:statement, postgresddl:serverOperand mixin
-[postgresddl:dropTablespaceStatement] > ddl:droppable, ddl:statement, postgresddl:tablespaceOperand mixin
-[postgresddl:dropTextSearchStatement] > ddl:droppable, ddl:statement, postgresddl:textSearchOperand mixin
-[postgresddl:dropTriggerStatement] > ddl:droppable, ddl:statement, postgresddl:triggerOperand mixin
-[postgresddl:dropTypeStatement] > ddl:droppable, ddl:statement, postgresddl:typeOperand mixin
-[postgresddl:dropUserStatement] > ddl:droppable, ddl:statement, postgresddl:userOperand mixin
-[postgresddl:dropUserMappingStatement] > ddl:droppable, ddl:statement, postgresddl:userMappingOperand mixin
-
-// =============================================================================
-// MISC STATEMENTS
-// =============================================================================
-
-[postgresddl:abortStatement] > ddl:statement mixin
-[postgresddl:analyzeStatement] > ddl:statement mixin
-[postgresddl:clusterStatement] > ddl:statement mixin
-[postgresddl:commentOnStatement] > ddl:statement, postgresddl:commentOperand mixin
- - postgresddl:targetObjectType (STRING) mandatory
- - postgresddl:targetObjectName (STRING)
- - postgresddl:comment (STRING) mandatory
-[postgresddl:copyStatement] > ddl:statement mixin
-[postgresddl:deallocateStatement] > ddl:statement mixin
-[postgresddl:declareStatement] > ddl:statement mixin
-[postgresddl:discardStatement] > ddl:statement mixin
-[postgresddl:explainStatement] > ddl:statement mixin
-[postgresddl:fetchStatement] > ddl:statement mixin
-[postgresddl:listenStatement] > ddl:statement mixin
-[postgresddl:loadStatement] > ddl:statement mixin
-[postgresddl:lockTableStatement] > ddl:statement mixin
-[postgresddl:moveStatement] > ddl:statement mixin
-[postgresddl:notifyStatement] > ddl:statement mixin
-[postgresddl:prepareStatement] > ddl:statement mixin
-[postgresddl:reassignOwnedStatement] > ddl:statement mixin
-[postgresddl:reindexStatement] > ddl:statement mixin
-[postgresddl:releaseSavepointStatement] > ddl:statement mixin
-[postgresddl:rollbackStatement] > ddl:statement mixin
-[postgresddl:selectIntoStatement] > ddl:statement mixin
-[postgresddl:showStatement] > ddl:statement mixin
-[postgresddl:truncateStatement] > ddl:statement mixin
-[postgresddl:unlistenStatement] > ddl:statement mixin
-[postgresddl:vacuumStatement] > ddl:statement mixin
-
-
Added: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/PostgresDdl.cnd
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/PostgresDdl.cnd (rev 0)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/PostgresDdl.cnd 2010-01-05 13:33:29 UTC (rev 1528)
@@ -0,0 +1,205 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+
+ //------------------------------------------------------------------------------
+// N A M E S P A C E S
+//------------------------------------------------------------------------------
+<jcr='http://www.jcp.org/jcr/1.0'>
+<nt='http://www.jcp.org/jcr/nt/1.0'>
+<mix='http://www.jcp.org/jcr/mix/1.0'>
+<ddl='http://www.jboss.org/dna/ddl/1.0'>
+<postgresddl='http://www.jboss.org/dna/ddl/postgres/1.0'>
+
+// =============================================================================
+// OPERANDS
+// =============================================================================
+[postgresddl:aggregateOperand] > ddl:operand abstract
+[postgresddl:castOperand] > ddl:operand abstract
+[postgresddl:commentOperand] > ddl:operand abstract
+[postgresddl:constraintTriggerOperand] > ddl:operand abstract
+[postgresddl:conversionOperand] > ddl:operand abstract
+[postgresddl:databaseOperand] > ddl:operand abstract
+[postgresddl:foreignDataOperand] > ddl:operand abstract
+[postgresddl:groupOperand] > ddl:operand abstract
+[postgresddl:functionOperand] > ddl:operand abstract
+[postgresddl:indexOperand] > ddl:operand abstract
+[postgresddl:languageOperand] > ddl:operand abstract
+[postgresddl:operatorOperand] > ddl:operand abstract
+[postgresddl:ownedByOperand] > ddl:operand abstract
+[postgresddl:roleOperand] > ddl:operand abstract
+[postgresddl:ruleOperand] > ddl:operand abstract
+[postgresddl:sequenceOperand] > ddl:operand abstract
+[postgresddl:serverOperand] > ddl:operand abstract
+[postgresddl:tablespaceOperand] > ddl:operand abstract
+[postgresddl:textSearchOperand] > ddl:operand abstract
+[postgresddl:triggerOperand] > ddl:operand abstract
+[postgresddl:typeOperand] > ddl:operand abstract
+[postgresddl:userOperand] > ddl:operand abstract
+[postgresddl:userMappingOperand] > ddl:operand abstract
+[postgresddl:parameterOperand] > ddl:operand abstract
+
+[postgresddl:functionParameter] > postgresddl:parameterOperand mixin
+ - ddl:datatypeName (STRING) mandatory
+ - ddl:datatypeLength (LONG)
+ - ddl:datatypePrecision (LONG)
+ - ddl:datatypeScale (LONG)
+ - ddl:nullable (STRING)
+ - ddl:defaultOption (STRING)
+ - postgresddl:mode (STRING)
+
+[postgresddl:role] > postgresddl:roleOperand mixin
+
+[postgresddl:renamedColumn] > ddl:renamable mixin
+
+// =============================================================================
+// ALTER STATEMENTS
+// =============================================================================
+[postgresddl:alterAggregateStatement] > ddl:alterable, ddl:statement, postgresddl:aggregateOperand mixin
+[postgresddl:alterConversionStatement] > ddl:alterable, ddl:statement, postgresddl:conversionOperand mixin
+[postgresddl:alterDatabaseStatement] > ddl:alterable, ddl:statement, postgresddl:databaseOperand mixin
+[postgresddl:alterForeignDataWrapperStatement] > ddl:alterable, ddl:statement, postgresddl:foreignDataOperand mixin
+[postgresddl:alterFunctionStatement] > ddl:alterable, ddl:statement, postgresddl:functionOperand mixin
+[postgresddl:alterGroupStatement] > ddl:alterable, ddl:statement, postgresddl:groupOperand mixin
+[postgresddl:alterIndexStatement] > ddl:alterable, ddl:statement, postgresddl:indexOperand mixin
+[postgresddl:alterLanguageStatement] > ddl:alterable, ddl:statement, postgresddl:languageOperand mixin
+[postgresddl:alterOperatorStatement] > ddl:alterable, ddl:statement, postgresddl:operatorOperand mixin
+[postgresddl:alterRoleStatement] > ddl:alterable, ddl:statement, postgresddl:roleOperand mixin
+[postgresddl:alterSchemaStatement] > ddl:alterable, ddl:statement, ddl:schemaOperand mixin
+[postgresddl:alterSequenceStatement] > ddl:alterable, ddl:statement, postgresddl:sequenceOperand mixin
+[postgresddl:alterServerStatement] > ddl:alterable, ddl:statement, postgresddl:serverOperand mixin
+[postgresddl:alterTablespaceStatement] > ddl:alterable, ddl:statement, postgresddl:tablespaceOperand mixin
+[postgresddl:alterTextSearchStatement] > ddl:alterable, ddl:statement, postgresddl:textSearchOperand mixin
+[postgresddl:alterTriggerStatement] > ddl:alterable, ddl:statement, postgresddl:triggerOperand mixin
+[postgresddl:alterTypeStatement] > ddl:alterable, ddl:statement, postgresddl:typeOperand mixin
+[postgresddl:alterUserStatement] > ddl:alterable, ddl:statement, postgresddl:userOperand mixin
+[postgresddl:alterUserMappingStatement] > ddl:alterable, ddl:statement, postgresddl:userMappingOperand mixin
+[postgresddl:alterViewStatement] > ddl:alterable, ddl:statement, ddl:viewOperand mixin
+
+[postgresddl:alterTableStatement] > ddl:alterTableStatement mixin
+ - postgresddl:newTableName (STRING)
+ - postgresddl:schemaName (STRING)
+ + postgresddl:renameColumn (postgresddl:renamedColumn) = postgresddl:renamedColumn multiple
+
+
+// =============================================================================
+// CREATE STATEMENTS
+// =============================================================================
+
+[postgresddl:createAggregateStatement] > ddl:creatable, ddl:statement, postgresddl:aggregateOperand mixin
+[postgresddl:createCastStatement] > ddl:creatable, ddl:statement, postgresddl:castOperand mixin
+[postgresddl:createConstraintTriggerStatement] > ddl:creatable, ddl:statement, postgresddl:constraintTriggerOperand mixin
+[postgresddl:createConversionStatement] > ddl:creatable, ddl:statement, postgresddl:conversionOperand mixin
+[postgresddl:createDatabaseStatement] > ddl:creatable, ddl:statement, postgresddl:databaseOperand mixin
+[postgresddl:createForeignDataWrapperStatement] > ddl:creatable, ddl:statement, postgresddl:foreignDataOperand mixin
+[postgresddl:createFunctionStatement] > ddl:creatable, ddl:statement, postgresddl:functionOperand mixin
+[postgresddl:createGroupStatement] > ddl:creatable, ddl:statement, postgresddl:groupOperand mixin
+[postgresddl:createIndexStatement] > ddl:creatable, ddl:statement, postgresddl:indexOperand mixin
+[postgresddl:createLanguageStatement] > ddl:creatable, ddl:statement, postgresddl:languageOperand mixin
+[postgresddl:createOperatorStatement] > ddl:creatable, ddl:statement, postgresddl:operatorOperand mixin
+[postgresddl:createRoleStatement] > ddl:creatable, ddl:statement, postgresddl:roleOperand mixin
+[postgresddl:createRuleStatement] > ddl:creatable, ddl:statement, postgresddl:ruleOperand mixin
+[postgresddl:createSequenceStatement] > ddl:creatable, ddl:statement, postgresddl:sequenceOperand mixin
+[postgresddl:createServerStatement] > ddl:creatable, ddl:statement, postgresddl:serverOperand mixin
+[postgresddl:createTablespaceStatement] > ddl:creatable, ddl:statement, postgresddl:tablespaceOperand mixin
+[postgresddl:createTextSearchStatement] > ddl:creatable, ddl:statement, postgresddl:textSearchOperand mixin
+[postgresddl:createTriggerStatement] > ddl:creatable, ddl:statement, postgresddl:triggerOperand mixin
+[postgresddl:createTypeStatement] > ddl:creatable, ddl:statement, postgresddl:typeOperand mixin
+[postgresddl:createUserStatement] > ddl:creatable, ddl:statement, postgresddl:userOperand mixin
+[postgresddl:createUserMappingStatement] > ddl:creatable, ddl:statement, postgresddl:userMappingOperand mixin
+
+// =============================================================================
+// DROP STATEMENTS
+// =============================================================================
+
+[postgresddl:dropAggregateStatement] > ddl:droppable, ddl:statement, postgresddl:aggregateOperand mixin
+[postgresddl:dropCastStatement] > ddl:droppable, ddl:statement, postgresddl:castOperand mixin
+[postgresddl:dropConstraintTriggerStatement] > ddl:droppable, ddl:statement, postgresddl:constraintTriggerOperand mixin
+[postgresddl:dropConversionStatement] > ddl:droppable, ddl:statement, postgresddl:conversionOperand mixin
+[postgresddl:dropDatabaseStatement] > ddl:droppable, ddl:statement, postgresddl:databaseOperand mixin
+[postgresddl:dropForeignDataWrapperStatement] > ddl:droppable, ddl:statement, postgresddl:foreignDataOperand mixin
+[postgresddl:dropFunctionStatement] > ddl:droppable, ddl:statement, postgresddl:functionOperand mixin
+[postgresddl:dropGroupStatement] > ddl:droppable, ddl:statement, postgresddl:groupOperand mixin
+[postgresddl:dropIndexStatement] > ddl:droppable, ddl:statement, postgresddl:indexOperand mixin
+[postgresddl:dropLanguageStatement] > ddl:droppable, ddl:statement, postgresddl:languageOperand mixin
+[postgresddl:dropOperatorStatement] > ddl:droppable, ddl:statement, postgresddl:operatorOperand mixin
+[postgresddl:dropOwnedByStatement] > ddl:droppable, ddl:statement, postgresddl:ownedByOperand mixin
+[postgresddl:dropRoleStatement] > ddl:droppable, ddl:statement, postgresddl:roleOperand mixin
+[postgresddl:dropRuleStatement] > ddl:droppable, ddl:statement, postgresddl:ruleOperand mixin
+[postgresddl:dropSequenceStatement] > ddl:droppable, ddl:statement, postgresddl:sequenceOperand mixin
+[postgresddl:dropServerStatement] > ddl:droppable, ddl:statement, postgresddl:serverOperand mixin
+[postgresddl:dropTablespaceStatement] > ddl:droppable, ddl:statement, postgresddl:tablespaceOperand mixin
+[postgresddl:dropTextSearchStatement] > ddl:droppable, ddl:statement, postgresddl:textSearchOperand mixin
+[postgresddl:dropTriggerStatement] > ddl:droppable, ddl:statement, postgresddl:triggerOperand mixin
+[postgresddl:dropTypeStatement] > ddl:droppable, ddl:statement, postgresddl:typeOperand mixin
+[postgresddl:dropUserStatement] > ddl:droppable, ddl:statement, postgresddl:userOperand mixin
+[postgresddl:dropUserMappingStatement] > ddl:droppable, ddl:statement, postgresddl:userMappingOperand mixin
+
+// =============================================================================
+// MISC STATEMENTS
+// =============================================================================
+
+[postgresddl:abortStatement] > ddl:statement mixin
+[postgresddl:analyzeStatement] > ddl:statement mixin
+[postgresddl:clusterStatement] > ddl:statement mixin
+[postgresddl:commentOnStatement] > ddl:statement, postgresddl:commentOperand mixin
+ - postgresddl:targetObjectType (STRING) mandatory
+ - postgresddl:targetObjectName (STRING)
+ - postgresddl:comment (STRING) mandatory
+[postgresddl:copyStatement] > ddl:statement mixin
+[postgresddl:deallocateStatement] > ddl:statement mixin
+[postgresddl:declareStatement] > ddl:statement mixin
+[postgresddl:discardStatement] > ddl:statement mixin
+[postgresddl:explainStatement] > ddl:statement mixin
+[postgresddl:fetchStatement] > ddl:statement mixin
+[postgresddl:listenStatement] > ddl:statement mixin
+[postgresddl:loadStatement] > ddl:statement mixin
+[postgresddl:lockTableStatement] > ddl:statement mixin
+[postgresddl:moveStatement] > ddl:statement mixin
+[postgresddl:notifyStatement] > ddl:statement mixin
+[postgresddl:prepareStatement] > ddl:statement mixin
+[postgresddl:reassignOwnedStatement] > ddl:statement mixin
+[postgresddl:reindexStatement] > ddl:statement mixin
+[postgresddl:releaseSavepointStatement] > ddl:statement mixin
+[postgresddl:rollbackStatement] > ddl:statement mixin
+[postgresddl:selectIntoStatement] > ddl:statement mixin
+[postgresddl:showStatement] > ddl:statement mixin
+[postgresddl:truncateStatement] > ddl:statement mixin
+[postgresddl:unlistenStatement] > ddl:statement mixin
+[postgresddl:vacuumStatement] > ddl:statement mixin
+
+// =============================================================================
+// GRANT STATEMENTS
+// =============================================================================
+[postgresddl:grantOnTableStatement] > ddl:grantStatement, ddl:tableOperand mixin
+[postgresddl:grantOnSequenceStatement] > ddl:grantStatement, postgresddl:sequenceOperand mixin
+[postgresddl:grantOnDatabaseStatement] > ddl:grantStatement, postgresddl:databaseOperand mixin
+[postgresddl:grantOnForeignDataWrapperStatement] > ddl:grantStatement, postgresddl:foreignDataOperand mixin
+[postgresddl:grantOnForeignServerStatement] > ddl:grantStatement, postgresddl:serverOperand mixin
+[postgresddl:grantOnFunctionStatement] > ddl:grantStatement, postgresddl:functionOperand mixin
+ + postgresddl:parameter (postgresddl:functionParameter) = postgresddl:functionParameter multiple
+[postgresddl:grantOnLanguageStatement] > ddl:grantStatement, postgresddl:languageOperand mixin
+[postgresddl:grantOnSchemaStatement] > ddl:grantStatement, ddl:schemaOperand mixin
+[postgresddl:grantOnTablespaceStatement] > ddl:grantStatement, postgresddl:tablespaceOperand mixin
+[postgresddl:grantRolesStatement] > ddl:grantStatement mixin
+ + postgresddl:grantRole (postgresddl:role) = postgresddl:role multiple
Property changes on: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/PostgresDdl.cnd
___________________________________________________________________
Name: svn:executable
+ *
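The PostgresDdl.cnd file above declares each node type on a single line: a bracketed name, an optional `>` list of supertypes, and trailing attribute keywords such as `mixin` or `abstract`. As a rough illustration of that one-line shape (a minimal sketch, not the DNA or Jackrabbit CND reader, and handling none of the property/child-node syntax), such a declaration can be split into its parts like this:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch: split a one-line CND node type declaration such as
// "[postgresddl:role] > postgresddl:roleOperand mixin" into its parts.
// Real CND files support far more syntax than this handles.
public class CndDeclSketch {
    final String name;             // node type name inside [ ]
    final List<String> supertypes; // names after '>'
    final boolean mixin;
    final boolean isAbstract;

    CndDeclSketch(String name, List<String> supertypes, boolean mixin, boolean isAbstract) {
        this.name = name;
        this.supertypes = supertypes;
        this.mixin = mixin;
        this.isAbstract = isAbstract;
    }

    static CndDeclSketch parse(String line) {
        String trimmed = line.trim();
        int close = trimmed.indexOf(']');
        String name = trimmed.substring(trimmed.indexOf('[') + 1, close);
        String rest = trimmed.substring(close + 1).trim();
        boolean mixin = false, abs = false;
        List<String> supers = new ArrayList<>();
        if (rest.startsWith(">")) {
            // supertypes are comma-separated; attribute keywords trail the list
            for (String tok : rest.substring(1).trim().split("[,\\s]+")) {
                if (tok.equals("mixin")) mixin = true;
                else if (tok.equals("abstract")) abs = true;
                else if (!tok.isEmpty()) supers.add(tok);
            }
        }
        return new CndDeclSketch(name, supers, mixin, abs);
    }

    public static void main(String[] args) {
        CndDeclSketch d = parse(
            "[postgresddl:alterRoleStatement] > ddl:alterable, ddl:statement, postgresddl:roleOperand mixin");
        System.out.println(d.name + " supertypes=" + d.supertypes + " mixin=" + d.mixin);
    }
}
```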
Modified: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/StandardDdl.cnd
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/StandardDdl.cnd 2010-01-05 13:31:47 UTC (rev 1527)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/StandardDdl.cnd 2010-01-05 13:33:29 UTC (rev 1528)
@@ -63,7 +63,7 @@
[ddl:settable] > ddl:operation abstract
[ddl:grantable] > ddl:operation abstract
[ddl:revokable] > ddl:operation abstract
-[ddl:renamable] > ddl:operation, ddl:operand abstract
+[ddl:renamable] > ddl:operation abstract
- ddl:newName (STRING)
// =============================================================================
@@ -119,6 +119,7 @@
[ddl:columnReference] > ddl:referenceOperand mixin
[ddl:tableReference] > ddl:referenceOperand mixin
[ddl:fkColumnReference] > ddl:referenceOperand mixin
+[ddl:grantee] > ddl:referenceOperand mixin
// =============================================================================
// SIMPLE STRING PROPERTY
@@ -258,9 +259,22 @@
[ddl:insertStatement] > ddl:statement, ddl:insertable mixin
// TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
-
-[ddl:grantStatement] > ddl:statement, ddl:grantable mixin
- // TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
+// =============================================================================
+// GRANT STATEMENTS
+// =============================================================================
+[ddl:grantPrivilege] mixin
+ - ddl:type (STRING) mandatory
+ + * (ddl:columnReference) = ddl:columnReference multiple
+[ddl:grantStatement] > ddl:statement, ddl:grantable mixin
+ - ddl:allPrivileges (boolean)
+ + * (ddl:grantPrivilege) = ddl:grantPrivilege multiple
+ + * (ddl:grantee) = ddl:grantee multiple
+
+[ddl:grantOnTableStatement] > ddl:grantStatement, ddl:tableOperand mixin
+[ddl:grantOnDomainStatement] > ddl:grantStatement, ddl:domainOperand mixin
+[ddl:grantOnCollationStatement] > ddl:grantStatement, ddl:collationOperand mixin
+[ddl:grantOnCharacterSetStatement] > ddl:grantStatement, ddl:characterSetOperand mixin
+[ddl:grantOnTranslationStatement] > ddl:grantStatement, ddl:translationOperand mixin
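The hunk above replaces the single generic `ddl:grantStatement` with per-object subtypes, following the SQL-92 `<object name>` alternatives quoted later in the parser diff (TABLE, DOMAIN, COLLATION, CHARACTER SET, TRANSLATION). A hypothetical dispatch on the keyword that follows `ON`, using only the node type names added above, could look like this (a sketch, not the actual StandardDdlParser logic):

```java
public class GrantTypeSketch {
    // Hypothetical mapping: the keyword after ON in a standard GRANT
    // statement selects which ddl:grantOn* node type to create.
    static String grantNodeType(String objectKeyword) {
        switch (objectKeyword.toUpperCase()) {
            case "DOMAIN":        return "ddl:grantOnDomainStatement";
            case "COLLATION":     return "ddl:grantOnCollationStatement";
            case "CHARACTER SET": return "ddl:grantOnCharacterSetStatement";
            case "TRANSLATION":   return "ddl:grantOnTranslationStatement";
            // The TABLE keyword is optional in SQL-92, so a bare object
            // name also falls through to the table form.
            default:              return "ddl:grantOnTableStatement";
        }
    }

    public static void main(String[] args) {
        System.out.println(grantNodeType("DOMAIN"));
    }
}
```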
Modified: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/postgres_test_statements.ddl
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/postgres_test_statements.ddl 2010-01-05 13:31:47 UTC (rev 1527)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/postgres_test_statements.ddl 2010-01-05 13:33:29 UTC (rev 1528)
@@ -595,4 +595,15 @@
CREATE TRIGGER trigger_name BEFORE dawn
ON table
EXECUTE PROCEDURE funcname ( 'arg1', 'arg2' );
--- 101 STATEMENTS *******************************************************
\ No newline at end of file
+
+ALTER TABLE foreign_companies RENAME COLUMN address TO city;
+
+ALTER TABLE us_companies RENAME TO suppliers;
+
+ALTER TABLE old_addresses ALTER COLUMN street SET NOT NULL;
+
+ALTER TABLE new_addresses ALTER COLUMN street DROP NOT NULL;
+
+GRANT EXECUTE ON FUNCTION divideByTwo(numerator int, IN demoninator int) TO george;
+
+-- 106 STATEMENTS *******************************************************
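The new test statements above exercise the `ALTER TABLE ... RENAME COLUMN` form that the `postgresddl:alterTableStatement` node type records via its `postgresddl:renameColumn` child. As a standalone illustration of extracting the pieces that end up on such a node (table, old column name, new name), assuming a simplified regex rather than the DNA tokenizer:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RenameSketch {
    // Illustrative only: pull out the table, renamed column, and new name
    // from a PostgreSQL "ALTER TABLE ... RENAME COLUMN ... TO ..." statement.
    // The real sequencer uses a token stream, not a regex.
    static final Pattern RENAME_COLUMN = Pattern.compile(
        "ALTER\\s+TABLE\\s+(\\w+)\\s+RENAME\\s+COLUMN\\s+(\\w+)\\s+TO\\s+(\\w+)\\s*;?",
        Pattern.CASE_INSENSITIVE);

    static String[] parseRenameColumn(String sql) {
        Matcher m = RENAME_COLUMN.matcher(sql.trim());
        return m.matches() ? new String[] {m.group(1), m.group(2), m.group(3)} : null;
    }

    public static void main(String[] args) {
        String[] parts = parseRenameColumn(
            "ALTER TABLE foreign_companies RENAME COLUMN address TO city;");
        System.out.println(parts[0] + "." + parts[1] + " -> " + parts[2]);
    }
}
```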
Added: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/standard_test_statements.ddl
===================================================================
--- trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/standard_test_statements.ddl (rev 0)
+++ trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/standard_test_statements.ddl 2010-01-05 13:33:29 UTC (rev 1528)
@@ -0,0 +1,49 @@
+CREATE TABLE MORE_ACTIVITIES (CITY_ID INT NOT NULL,
+ SEASON VARCHAR(20), ACTIVITY VARCHAR(32) NOT NULL);
+
+CREATE SCHEMA AUTHORIZATION oe
+ CREATE TABLE new_product
+ (color VARCHAR(10) PRIMARY KEY, quantity NUMERIC)
+ CREATE VIEW new_product_view
+ AS SELECT color, quantity FROM new_product WHERE color = 'RED'
+ GRANT select ON new_product_view TO hr;
+
+CREATE TABLE PEOPLE
+ (PERSON_ID INT NOT NULL CONSTRAINT PEOPLE_PK PRIMARY KEY, PERSON VARCHAR(26));
+
+CREATE SCHEMA schema_name_1
+ CREATE TABLE table_name_15 (
+ column_name_1 VARCHAR(255)
+ REFERENCES ref_table_name (ref_column_name_1)
+ ON UPDATE NO ACTION )
+ CREATE VIEW SAMP.V1 (COL_SUM, COL_DIFF)
+ AS SELECT COMM + BONUS, COMM - BONUS
+ FROM SAMP.EMPLOYEE
+ CREATE TABLE table_name26 (
+ column_name_1 VARCHAR(255),
+ UNIQUE (ref_column_name_1));
+
+CREATE TABLE table_name29 (
+ column_name_1 VARCHAR(255),
+ CONSTRAINT fk_name FOREIGN KEY (ref_column_name_1, ref_column_name_2)
+ REFERENCES ref_table_name (ref_column_name_1)
+ ON DELETE CASCADE ON UPDATE SET NULL
+ MATCH FULL);
+
+CREATE TABLE ACTIVITIES (CITY_ID INT NOT NULL,
+ SEASON VARCHAR(20), ACTIVITY VARCHAR(32) NOT NULL)
+
+CREATE TABLE HOTELAVAILABILITY
+ (HOTEL_ID INT NOT NULL, BOOKING_DATE DATE NOT NULL,
+ ROOMS_TAKEN INT DEFAULT 0, PRIMARY KEY (HOTEL_ID, BOOKING_DATE));
+
+GRANT SELECT ON TABLE purchaseOrders TO maria,harry;
+
+GRANT UPDATE, USAGE ON TABLE billedOrders TO anita,zhi;
+
+GRANT SELECT ON TABLE orders.bills to PUBLIC;
+
+GRANT INSERT(a, b, c) ON TABLE purchaseOrders TO purchases_reader_role;
+
+
+
Property changes on: trunk/dna-integration-tests/src/test/resources/org/jboss/dna/test/integration/sequencer/ddl/standard_test_statements.ddl
___________________________________________________________________
Name: svn:executable
+ *
DNA SVN: r1527 - in trunk/extensions/dna-sequencer-ddl/src: main/java/org/jboss/dna/sequencer/ddl/dialect/derby and 10 other directories.
by dna-commits@lists.jboss.org
Author: blafond
Date: 2010-01-05 08:31:47 -0500 (Tue, 05 Jan 2010)
New Revision: 1527
Added:
trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/StandardDdl.cnd
trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdl.cnd
trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdl.cnd
Removed:
trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/StandardDdl.cnd
trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdl.cnd
trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdl.cnd
Modified:
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/DdlConstants.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/DdlSequencerI18n.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/StandardDdlLexicon.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/StandardDdlParser.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlConstants.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlLexicon.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlParser.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/oracle/OracleDdlParser.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlConstants.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlLexicon.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlParser.java
trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/DdlSequencerI18n.properties
trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/DdlParserTestHelper.java
trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/DdlSequencerTest.java
trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/StandardDdlParserTest.java
trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlParserTest.java
trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlParserTest.java
trunk/extensions/dna-sequencer-ddl/src/test/resources/ddl/dialect/postgres/postgres_test_statements_4.ddl
trunk/extensions/dna-sequencer-ddl/src/test/resources/ddl/standardDdlTest.ddl
Log:
DNA-49 Adding additional parsing and tests. Includes parsing for Grant statements in Standard, Derby & Postgres
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/DdlConstants.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/DdlConstants.java 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/DdlConstants.java 2010-01-05 13:31:47 UTC (rev 1527)
@@ -40,6 +40,9 @@
public static final String GRANT = "GRANT";
public static final String INDEX = "INDEX";
public static final String INSERT = "INSERT";
+ public static final String UPDATE = "UPDATE";
+ public static final String DELETE = "DELETE";
+ public static final String SELECT = "SELECT";
public static final String KEY = "KEY";
public static final String OFF = "OFF";
public static final String ON = "ON";
@@ -125,7 +128,9 @@
// | <translation definition>
// | <assertion definition>
public final static Name[] VALID_SCHEMA_CHILD_TYPES = {StandardDdlLexicon.TYPE_CREATE_TABLE_STATEMENT,
- StandardDdlLexicon.TYPE_CREATE_VIEW_STATEMENT, StandardDdlLexicon.TYPE_GRANT_STATEMENT,
+ StandardDdlLexicon.TYPE_CREATE_VIEW_STATEMENT, StandardDdlLexicon.TYPE_GRANT_ON_TABLE_STATEMENT,
+ StandardDdlLexicon.TYPE_GRANT_ON_DOMAIN_STATEMENT, StandardDdlLexicon.TYPE_GRANT_ON_TRANSLATION_STATEMENT,
+ StandardDdlLexicon.TYPE_GRANT_ON_COLLATION_STATEMENT, StandardDdlLexicon.TYPE_GRANT_ON_CHARACTER_SET_STATEMENT,
StandardDdlLexicon.TYPE_CREATE_DOMAIN_STATEMENT, StandardDdlLexicon.TYPE_CREATE_CHARACTER_SET_STATEMENT,
StandardDdlLexicon.TYPE_CREATE_COLLATION_STATEMENT, StandardDdlLexicon.TYPE_CREATE_TRANSLATION_STATEMENT,
StandardDdlLexicon.TYPE_CREATE_ASSERTION_STATEMENT};
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/DdlSequencerI18n.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/DdlSequencerI18n.java 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/DdlSequencerI18n.java 2010-01-05 13:31:47 UTC (rev 1527)
@@ -42,11 +42,11 @@
public static I18n unusedTokensParsingColumnDefinition;
public static I18n alterTableOptionNotFound;
public static I18n unusedTokensParsingCreateIndex;
+ public static I18n missingReturnTypeForFunction;
// public static I18n
// public static I18n
// public static I18n
// public static I18n
-// public static I18n
static {
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/StandardDdlLexicon.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/StandardDdlLexicon.java 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/StandardDdlLexicon.java 2010-01-05 13:31:47 UTC (rev 1527)
@@ -95,6 +95,11 @@
public static final Name TYPE_ALTER_TABLE_STATEMENT = new BasicName(Namespace.URI, "alterTableStatement");
public static final Name TYPE_ALTER_DOMAIN_STATEMENT = new BasicName(Namespace.URI, "alterDomainStatement");
public static final Name TYPE_GRANT_STATEMENT = new BasicName(Namespace.URI, "grantStatement");
+ public static final Name TYPE_GRANT_ON_TABLE_STATEMENT = new BasicName(Namespace.URI, "grantOnTableStatement");
+ public static final Name TYPE_GRANT_ON_DOMAIN_STATEMENT = new BasicName(Namespace.URI, "grantOnDomainStatement");
+ public static final Name TYPE_GRANT_ON_COLLATION_STATEMENT = new BasicName(Namespace.URI, "grantOnCollationStatement");
+ public static final Name TYPE_GRANT_ON_CHARACTER_SET_STATEMENT = new BasicName(Namespace.URI, "grantOnCharacterSetStatement");
+ public static final Name TYPE_GRANT_ON_TRANSLATION_STATEMENT = new BasicName(Namespace.URI, "grantOnTranslationStatement");
public static final Name TYPE_SET_STATEMENT = new BasicName(Namespace.URI, "setStatement");
public static final Name TYPE_INSERT_STATEMENT = new BasicName(Namespace.URI, "insertStatement");
@@ -122,18 +127,21 @@
public static final Name TYPE_FK_COLUMN_REFERENCE = new BasicName(Namespace.URI, "fkColumnReference");
public static final Name TYPE_CLAUSE = new BasicName(Namespace.URI, "clause");
- public static final Name DDL_EXPRESSION = new BasicName(Namespace.URI, "expression");
- public static final Name DDL_ORIGINAL_EXPRESSION = new BasicName(Namespace.URI, "originalExpression");
- public static final Name DDL_START_LINE_NUMBER = new BasicName(Namespace.URI, "startLineNumber");
- public static final Name DDL_START_COLUMN_NUMBER = new BasicName(Namespace.URI, "startColumnNumber");
- public static final Name DDL_START_CHAR_INDEX = new BasicName(Namespace.URI, "startCharIndex");
+ public static final Name DDL_EXPRESSION = new BasicName(Namespace.URI, "expression");
+ public static final Name DDL_ORIGINAL_EXPRESSION = new BasicName(Namespace.URI, "originalExpression");
+ public static final Name DDL_START_LINE_NUMBER = new BasicName(Namespace.URI, "startLineNumber");
+ public static final Name DDL_START_COLUMN_NUMBER = new BasicName(Namespace.URI, "startColumnNumber");
+ public static final Name DDL_START_CHAR_INDEX = new BasicName(Namespace.URI, "startCharIndex");
// public static final Name DDL_LENGTH = new BasicName(Namespace.URI, "length");
/*
* node property names
*/
public static final Name NAME = new BasicName(Namespace.URI, "name");
+ public static final Name OPTION = new BasicName(Namespace.URI, "option");
+ public static final Name TYPE = new BasicName(Namespace.URI, "type");
public static final Name NEW_NAME = new BasicName(Namespace.URI, "newName");
+ public static final Name SQL = new BasicName(Namespace.URI, "sql");
public static final Name TEMPORARY = new BasicName(Namespace.URI, "temporary");
public static final Name ON_COMMIT_VALUE = new BasicName(Namespace.URI, "onCommitValue");
public static final Name COLUMN_DEFINITIONS = new BasicName(Namespace.URI, "datatypeScale");
@@ -153,6 +161,10 @@
public static final Name DROP_BEHAVIOR = new BasicName(Namespace.URI, "dropBehavior");
public static final Name PROPERTY_VALUE = new BasicName(Namespace.URI, "propValue");
public static final Name PROBLEM_LEVEL = new BasicName(Namespace.URI, "problemLevel");
+ public static final Name GRANT_PRIVILEGE = new BasicName(Namespace.URI, "grantPrivilege");
+ public static final Name ALL_PRIVILEGES = new BasicName(Namespace.URI, "allPrivileges");
+ public static final Name WITH_GRANT_OPTION = new BasicName(Namespace.URI, "withGrantOption");
+ public static final Name GRANTEE = new BasicName(Namespace.URI, "grantee");
public static final Name CREATE_VIEW_QUERY_EXPRESSION = new BasicName(Namespace.URI, "queryExpression");
public static final Name CREATE_VIEW_OPTION_CLAUSE = new BasicName(Namespace.URI, "createViewOption");
@@ -163,15 +175,7 @@
* node child types
*/
- // public static final Name COLUMN_REFERENCE_TYPE = new BasicName(Namespace.URI, "columnReference");
- // public static final Name TABLE_CONSTRAINT_TYPE = new BasicName(Namespace.URI, "tableConstraint");
- // public static final Name STATEMENT_OPTION_TYPE = new BasicName(Namespace.URI, "statementOption");
public static final Name DROP_OPTION_TYPE = new BasicName(Namespace.URI, "dropOption");
- // public static final Name DROP_COLUMN_DEFINITION_TYPE = new BasicName(Namespace.URI, "dropColumnDefinition");
- // public static final Name DROP_TABLE_CONSTRAINT_TYPE = new BasicName(Namespace.URI, "dropTableConstraint");
- // public static final Name ALTER_COLUMN_DEFINITION_TYPE = new BasicName(Namespace.URI, "alterColumnDefinition");
- // public static final Name TABLE_REFERENCE_TYPE = new BasicName(Namespace.URI, "tableReference ");
- // public static final Name FK_COLUMN_REFERENCE_TYPE = new BasicName(Namespace.URI, "fkColumnReference ");
public static final Name COLUMN_ATTRIBUTE_TYPE = new BasicName(Namespace.URI, "columnAttribute");
public static final Name CONSTRAINT_ATTRIBUTE_TYPE = new BasicName(Namespace.URI, "constraintAttribute ");
}
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/StandardDdlParser.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/StandardDdlParser.java 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/StandardDdlParser.java 2010-01-05 13:31:47 UTC (rev 1527)
@@ -29,60 +29,7 @@
*/
package org.jboss.dna.sequencer.ddl;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.CHECK_SEARCH_CONDITION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.COLLATION_NAME;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.CONSTRAINT_ATTRIBUTE_TYPE;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.CONSTRAINT_TYPE;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.CREATE_VIEW_QUERY_EXPRESSION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DDL_EXPRESSION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DDL_START_CHAR_INDEX;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DDL_START_COLUMN_NUMBER;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DDL_START_LINE_NUMBER;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DEFAULT_OPTION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DEFAULT_PRECISION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DEFAULT_VALUE;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DROP_BEHAVIOR;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.MESSAGE;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.NAME;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.NULLABLE;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.PROBLEM_LEVEL;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.PROPERTY_VALUE;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TEMPORARY;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_ADD_TABLE_CONSTRAINT_DEFINITION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_ALTER_COLUMN_DEFINITION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_ALTER_DOMAIN_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_ALTER_TABLE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_COLUMN_DEFINITION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_COLUMN_REFERENCE;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_CREATE_ASSERTION_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_CREATE_CHARACTER_SET_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_CREATE_COLLATION_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_CREATE_DOMAIN_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_CREATE_SCHEMA_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_CREATE_TABLE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_CREATE_TRANSLATION_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_CREATE_VIEW_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_ASSERTION_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_CHARACTER_SET_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_COLLATION_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_COLUMN_DEFINITION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_DOMAIN_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_SCHEMA_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_TABLE_CONSTRAINT_DEFINITION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_TABLE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_TRANSLATION_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_VIEW_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_FK_COLUMN_REFERENCE;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_GRANT_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_INSERT_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_MISSING_TERMINATOR;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_PROBLEM;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_SET_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_STATEMENT_OPTION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_TABLE_CONSTRAINT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_TABLE_REFERENCE;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.VALUE;
+import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.*;
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.Collections;
@@ -828,24 +775,140 @@
AstNode parentNode ) throws ParsingException {
assert tokens != null;
assert parentNode != null;
-
- // Original implementation does NOT parse Insert statement, but just returns a generic TypedStatement
+ assert tokens.matches(GRANT);
+
markStartOfStatement(tokens);
- tokens.consume(GRANT);
- String name = GRANT;
+ // Syntax for tables
+ //
+ // GRANT <privileges> ON <object name>
+ // TO <grantee> [ { <comma> <grantee> }... ]
+ // [ WITH GRANT OPTION ]
+ //
+ // <object name> ::=
+ // [ TABLE ] <table name>
+ // | DOMAIN <domain name>
+ // | COLLATION <collation name>
+ // | CHARACTER SET <character set name>
+ // | TRANSLATION <translation name>
+ //
+ // Syntax for roles
+ //
+ // GRANT roleName [ {, roleName }* ] TO grantees
+
+ // privilege-types
+ //
+ // ALL PRIVILEGES | privilege-list
+ //
+ AstNode grantNode = null;
+ boolean allPrivileges = false;
- tokens.consume(); // First Privilege token
+ List<AstNode> privileges = new ArrayList<AstNode>();
- AstNode node = nodeFactory().node(name, parentNode, TYPE_GRANT_STATEMENT);
+ tokens.consume("GRANT");
- parseUntilTerminator(tokens);
+ if( tokens.canConsume("ALL", "PRIVILEGES")) {
+ allPrivileges = true;
+ } else {
+ parseGrantPrivileges(tokens, privileges);
+ }
+ tokens.consume("ON");
- markEndOfStatement(tokens, node);
+ if( tokens.canConsume("DOMAIN") ) {
+ String name = parseName(tokens);
+ grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_DOMAIN_STATEMENT);
+ } else if( tokens.canConsume("COLLATION")) {
+ String name = parseName(tokens);
+ grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_COLLATION_STATEMENT);
+ } else if( tokens.canConsume("CHARACTER", "SET")) {
+ String name = parseName(tokens);
+ grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_CHARACTER_SET_STATEMENT);
+ } else if( tokens.canConsume("TRANSLATION")) {
+ String name = parseName(tokens);
+ grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_TRANSLATION_STATEMENT);
+ } else {
+ tokens.canConsume(TABLE); // OPTIONAL
+ String name = parseName(tokens);
+ grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_TABLE_STATEMENT);
+ }
- return node;
+
+ // Attach privileges to grant node
+ for( AstNode node : privileges ) {
+ node.setParent(grantNode);
+ }
+ if( allPrivileges ) {
+ grantNode.setProperty(ALL_PRIVILEGES, allPrivileges);
+ }
+
+
+ tokens.consume("TO");
+
+ do {
+ String grantee = parseName(tokens);
+ nodeFactory().node(grantee, grantNode, GRANTEE);
+ } while( tokens.canConsume(COMMA));
+
+        if( tokens.canConsume("WITH", "GRANT", "OPTION")) {
+            // Copy-paste fix: record the grant option itself instead of re-setting ALL_PRIVILEGES
+            // (assumes a WITH_GRANT_OPTION name in the lexicon)
+            grantNode.setProperty(WITH_GRANT_OPTION, true);
+        }
+
+ markEndOfStatement(tokens, grantNode);
+
+ return grantNode;
}
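The else-if ladder after `ON` above can be hard to follow in diff form. The sketch below is a minimal, self-contained illustration of the same `canConsume()`-style dispatch; `Tokens`, `classifyGrantTarget`, and the returned type names are hypothetical stand-ins, not the real `DdlTokenStream`/`StandardDdlLexicon` API.

```java
import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.Deque;
import java.util.Iterator;

public class GrantDispatch {

    // Tiny look-ahead token stream: consume tokens only when the whole phrase matches.
    public static final class Tokens {
        private final Deque<String> toks;

        public Tokens(String... toks) {
            this.toks = new ArrayDeque<>(Arrays.asList(toks));
        }

        public boolean canConsume(String... phrase) {
            Iterator<String> it = toks.iterator();
            for (String p : phrase) {
                if (!it.hasNext() || !it.next().equalsIgnoreCase(p)) return false;
            }
            for (int i = 0; i < phrase.length; i++) toks.poll();
            return true;
        }
    }

    // Mirrors the ladder after "ON": the keyword decides the statement type,
    // and TABLE is an optional keyword before a table name (the default case).
    public static String classifyGrantTarget(Tokens tokens) {
        if (tokens.canConsume("DOMAIN")) return "grantOnDomainStatement";
        if (tokens.canConsume("COLLATION")) return "grantOnCollationStatement";
        if (tokens.canConsume("CHARACTER", "SET")) return "grantOnCharacterSetStatement";
        if (tokens.canConsume("TRANSLATION")) return "grantOnTranslationStatement";
        tokens.canConsume("TABLE"); // optional keyword
        return "grantOnTableStatement";
    }
}
```

The multi-token `canConsume("CHARACTER", "SET")` call is the key idiom: it either consumes the whole phrase or nothing, so a lone `CHARACTER` column name cannot be half-consumed.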
+ protected void parseGrantPrivileges( DdlTokenStream tokens, List<AstNode> privileges) throws ParsingException {
+ // privilege-types
+ //
+ // ALL PRIVILEGES | privilege-list
+ //
+ // privilege-list
+ //
+ // table-privilege {, table-privilege }*
+ //
+ // table-privilege
+ // SELECT
+ // | DELETE
+ // | INSERT [ <left paren> <privilege column list> <right paren> ]
+ // | UPDATE [ <left paren> <privilege column list> <right paren> ]
+ // | REFERENCES [ <left paren> <privilege column list> <right paren> ]
+ // | USAGE
+
+ do {
+ AstNode node = null;
+
+ if( tokens.canConsume(DELETE)) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, DELETE);
+ } else if( tokens.canConsume(INSERT)) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, INSERT);
+ parseColumnNameList(tokens, node, TYPE_COLUMN_REFERENCE);
+ } else if( tokens.canConsume("REFERENCES")) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, "REFERENCES");
+ parseColumnNameList(tokens, node, TYPE_COLUMN_REFERENCE);
+ } else if( tokens.canConsume(SELECT)) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, SELECT);
+ } else if( tokens.canConsume("USAGE")) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, "USAGE");
+ } else if( tokens.canConsume(UPDATE)) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, UPDATE);
+ parseColumnNameList(tokens, node, TYPE_COLUMN_REFERENCE);
+ }
+ if( node == null) {
+ break;
+ }
+ nodeFactory().setType(node, GRANT_PRIVILEGE);
+ privileges.add(node);
+
+ } while( tokens.canConsume(COMMA));
+
+ }
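The do/while loop in `parseGrantPrivileges` can be summarized outside the DNA API as follows. This is an illustrative sketch, not the committed code: it reduces each privilege to a string and ignores the optional column lists, but keeps the two structural rules — an unrecognized token ends the list (the `if (node == null) break;` guard), and the loop only continues across an explicit comma.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class PrivilegeList {
    private static final Set<String> PRIVILEGES = new HashSet<>(
            Arrays.asList("SELECT", "DELETE", "INSERT", "UPDATE", "REFERENCES", "USAGE"));

    public static List<String> parsePrivileges(List<String> tokens) {
        List<String> result = new ArrayList<>();
        int i = 0;
        while (i < tokens.size() && PRIVILEGES.contains(tokens.get(i).toUpperCase())) {
            result.add(tokens.get(i++).toUpperCase());
            // continue only across an explicit comma, like canConsume(COMMA)
            if (i < tokens.size() && tokens.get(i).equals(",")) {
                i++;
            } else {
                break;
            }
        }
        return result;
    }
}
```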
/**
* Catch-all method to parse unknown (not registered or handled by sub-classes) statements.
*
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlConstants.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlConstants.java 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlConstants.java 2010-01-05 13:31:47 UTC (rev 1527)
@@ -96,7 +96,7 @@
public final static Name[] VALID_SCHEMA_CHILD_STMTS = {
StandardDdlLexicon.TYPE_CREATE_TABLE_STATEMENT,
StandardDdlLexicon.TYPE_CREATE_VIEW_STATEMENT,
- StandardDdlLexicon.TYPE_GRANT_STATEMENT
+ StandardDdlLexicon.TYPE_GRANT_ON_TABLE_STATEMENT
};
}
interface DerbyDataTypes {
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlLexicon.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlLexicon.java 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlLexicon.java 2010-01-05 13:31:47 UTC (rev 1527)
@@ -41,50 +41,59 @@
*
* SQL-92 Spec
*
- * CREATE SCHEMA
- * CREATE DOMAIN
- * CREATE [ { GLOBAL | LOCAL } TEMPORARY ] TABLE
- * CREATE VIEW
- * CREATE ASSERTION
- * CREATE CHARACTER SET
- * CREATE COLLATION
- * CREATE TRANSLATION
- * ===> CREATE FUNCTION
- * ===> CREATE INDEX
- * ===> CREATE PROCEDURE
- * ===> CREATE ROLE
- * ===> CREATE SYNONYM
- * ===> CREATE TRIGGER
- *
- * ALTER TABLE
- *
- * ===> GRANT
- * ===> LOCK TABLE
- * ===> RENAME TABLE
- * ===> RENAME INDEX
- * ===> SET
- * ===> DECLARE GLOBAL TEMPORARY TABLE
+ * CREATE SCHEMA
+ * CREATE DOMAIN
+ * CREATE [ { GLOBAL | LOCAL } TEMPORARY ] TABLE
+ * CREATE VIEW
+ * CREATE ASSERTION
+ * CREATE CHARACTER SET
+ * CREATE COLLATION
+ * CREATE TRANSLATION
+ * ===> CREATE FUNCTION
+ * ===> CREATE INDEX
+ * ===> CREATE PROCEDURE
+ * ===> CREATE ROLE
+ * ===> CREATE SYNONYM
+ * ===> CREATE TRIGGER
+ *
+ * ALTER TABLE
+ *
+ * ===> GRANT
+ * ===> LOCK TABLE
+ * ===> RENAME TABLE
+ * ===> RENAME INDEX
+ * ===> SET
+ * ===> DECLARE GLOBAL TEMPORARY TABLE
*/
- public static final Name TYPE_CREATE_FUNCTION_STATEMENT = new BasicName(Namespace.URI, "createFunctionStatement");
- public static final Name TYPE_CREATE_INDEX_STATEMENT = new BasicName(Namespace.URI, "createIndexStatement");
- public static final Name TYPE_CREATE_PROCEDURE_STATEMENT = new BasicName(Namespace.URI, "createProcedureStatement");
- public static final Name TYPE_CREATE_ROLE_STATEMENT = new BasicName(Namespace.URI, "createRoleStatement");
- public static final Name TYPE_CREATE_SYNONYM_STATEMENT = new BasicName(Namespace.URI, "createSynonymStatement");
- public static final Name TYPE_CREATE_TRIGGER_STATEMENT = new BasicName(Namespace.URI, "createTriggerStatement");
- public static final Name TYPE_LOCK_TABLE_STATEMENT = new BasicName(Namespace.URI, "lockTableStatement");
- public static final Name TYPE_RENAME_TABLE_STATEMENT = new BasicName(Namespace.URI, "renameTableStatement");
- public static final Name TYPE_RENAME_INDEX_STATEMENT = new BasicName(Namespace.URI, "renameIndexStatement");
+ public static final Name TYPE_CREATE_FUNCTION_STATEMENT = new BasicName(Namespace.URI, "createFunctionStatement");
+ public static final Name TYPE_CREATE_INDEX_STATEMENT = new BasicName(Namespace.URI, "createIndexStatement");
+ public static final Name TYPE_CREATE_PROCEDURE_STATEMENT = new BasicName(Namespace.URI, "createProcedureStatement");
+ public static final Name TYPE_CREATE_ROLE_STATEMENT = new BasicName(Namespace.URI, "createRoleStatement");
+ public static final Name TYPE_CREATE_SYNONYM_STATEMENT = new BasicName(Namespace.URI, "createSynonymStatement");
+ public static final Name TYPE_CREATE_TRIGGER_STATEMENT = new BasicName(Namespace.URI, "createTriggerStatement");
+ public static final Name TYPE_LOCK_TABLE_STATEMENT = new BasicName(Namespace.URI, "lockTableStatement");
+ public static final Name TYPE_RENAME_TABLE_STATEMENT = new BasicName(Namespace.URI, "renameTableStatement");
+ public static final Name TYPE_RENAME_INDEX_STATEMENT = new BasicName(Namespace.URI, "renameIndexStatement");
public static final Name TYPE_DECLARE_GLOBAL_TEMPORARY_TABLE_STATEMENT = new BasicName(Namespace.URI, "declareGlobalTemporaryTableStatement");
- public static final Name TYPE_DROP_FUNCTION_STATEMENT = new BasicName(Namespace.URI, "dropFunctionStatement");
- public static final Name TYPE_DROP_INDEX_STATEMENT = new BasicName(Namespace.URI, "dropIndexStatement");
- public static final Name TYPE_DROP_PROCEDURE_STATEMENT = new BasicName(Namespace.URI, "dropProcedureStatement");
- public static final Name TYPE_DROP_ROLE_STATEMENT = new BasicName(Namespace.URI, "dropRoleStatement");
- public static final Name TYPE_DROP_SYNONYM_STATEMENT = new BasicName(Namespace.URI, "dropSynonymStatement");
- public static final Name TYPE_DROP_TRIGGER_STATEMENT = new BasicName(Namespace.URI, "dropTriggerStatement");
+ public static final Name TYPE_DROP_FUNCTION_STATEMENT = new BasicName(Namespace.URI, "dropFunctionStatement");
+ public static final Name TYPE_DROP_INDEX_STATEMENT = new BasicName(Namespace.URI, "dropIndexStatement");
+ public static final Name TYPE_DROP_PROCEDURE_STATEMENT = new BasicName(Namespace.URI, "dropProcedureStatement");
+ public static final Name TYPE_DROP_ROLE_STATEMENT = new BasicName(Namespace.URI, "dropRoleStatement");
+ public static final Name TYPE_DROP_SYNONYM_STATEMENT = new BasicName(Namespace.URI, "dropSynonymStatement");
+ public static final Name TYPE_DROP_TRIGGER_STATEMENT = new BasicName(Namespace.URI, "dropTriggerStatement");
- public static final Name UNIQUE_INDEX = new BasicName(Namespace.URI, "unique");
- public static final Name TABLE_NAME = new BasicName(Namespace.URI, "tableName");
- public static final Name GENERATED_COLUMN_SPEC_CLAUSE = new BasicName(Namespace.URI, "generatedColumnSpecClause");
+ public static final Name TYPE_FUNCTION_PARAMETER = new BasicName(Namespace.URI, "functionParameter");
+ public static final Name TYPE_INDEX_COLUMN_REFERENCE = new BasicName(Namespace.URI, "indexColumnReference");
+ public static final Name TYPE_GRANT_ON_FUNCTION_STATEMENT = new BasicName(Namespace.URI, "grantOnFunctionStatement");
+ public static final Name TYPE_GRANT_ON_PROCEDURE_STATEMENT = new BasicName(Namespace.URI, "grantOnProcedureStatement");
+ public static final Name TYPE_GRANT_ROLES_STATEMENT = new BasicName(Namespace.URI, "grantRolesStatement");
+
+ public static final Name UNIQUE_INDEX = new BasicName(Namespace.URI, "unique");
+ public static final Name ORDER = new BasicName(Namespace.URI, "order");
+ public static final Name TABLE_NAME = new BasicName(Namespace.URI, "tableName");
+ public static final Name ROLE_NAME = new BasicName(Namespace.URI, "roleName");
+ public static final Name GENERATED_COLUMN_SPEC_CLAUSE = new BasicName(Namespace.URI, "generatedColumnSpecClause");
+ public static final Name IS_TABLE_TYPE = new BasicName(Namespace.URI, "isTableType");
}
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlParser.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlParser.java 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlParser.java 2010-01-05 13:31:47 UTC (rev 1527)
@@ -1,26 +1,7 @@
package org.jboss.dna.sequencer.ddl.dialect.derby;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.COLUMN_ATTRIBUTE_TYPE;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.PROPERTY_VALUE;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_ALTER_COLUMN_DEFINITION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_ALTER_TABLE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_COLUMN_DEFINITION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_COLUMN_DEFINITION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_TABLE_CONSTRAINT_DEFINITION;
-import static org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon.TABLE_NAME;
-import static org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon.TYPE_CREATE_FUNCTION_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon.TYPE_CREATE_INDEX_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon.TYPE_CREATE_PROCEDURE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon.TYPE_CREATE_ROLE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon.TYPE_CREATE_SYNONYM_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon.TYPE_CREATE_TRIGGER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon.TYPE_DROP_FUNCTION_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon.TYPE_DROP_INDEX_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon.TYPE_DROP_PROCEDURE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon.TYPE_DROP_ROLE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon.TYPE_DROP_SYNONYM_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon.TYPE_DROP_TRIGGER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon.UNIQUE_INDEX;
+import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.*;
+import static org.jboss.dna.sequencer.ddl.dialect.derby.DerbyDdlLexicon.*;
import java.util.ArrayList;
import java.util.List;
import org.jboss.dna.common.text.ParsingException;
@@ -38,7 +19,9 @@
/**
* Derby-specific DDL Parser. Includes custom data types as well as custom DDL statements.
*/
-public class DerbyDdlParser extends StandardDdlParser implements DerbyDdlConstants {
+public class DerbyDdlParser extends StandardDdlParser
+ implements DerbyDdlConstants,
+ DerbyDdlConstants.DerbyStatementStartPhrases {
private final String parserId = "DERBY";
static List<String[]> derbyDataTypeStrings = new ArrayList<String[]>();
@@ -69,11 +52,11 @@
public void registerWords( DdlTokenStream tokens ) {
tokens.registerKeyWords(CUSTOM_KEYWORDS);
- tokens.registerStatementStartPhrase(DerbyStatementStartPhrases.ALTER_PHRASES);
- tokens.registerStatementStartPhrase(DerbyStatementStartPhrases.CREATE_PHRASES);
- tokens.registerStatementStartPhrase(DerbyStatementStartPhrases.DROP_PHRASES);
- tokens.registerStatementStartPhrase(DerbyStatementStartPhrases.SET_PHRASES);
- tokens.registerStatementStartPhrase(DerbyStatementStartPhrases.MISC_PHRASES);
+ tokens.registerStatementStartPhrase(ALTER_PHRASES);
+ tokens.registerStatementStartPhrase(CREATE_PHRASES);
+ tokens.registerStatementStartPhrase(DROP_PHRASES);
+ tokens.registerStatementStartPhrase(SET_PHRASES);
+ tokens.registerStatementStartPhrase(MISC_PHRASES);
super.registerWords(tokens);
}
@@ -94,7 +77,7 @@
*/
@Override
protected Name[] getValidSchemaChildTypes() {
- return DerbyStatementStartPhrases.VALID_SCHEMA_CHILD_STMTS;
+ return VALID_SCHEMA_CHILD_STMTS;
}
/**
@@ -111,34 +94,14 @@
AstNode result = super.parseCustomStatement(tokens, parentNode);
if (result == null) {
- if (tokens.matches(DerbyStatementStartPhrases.STMT_LOCK_TABLE)) {
- markStartOfStatement(tokens);
- tokens.consume(DerbyStatementStartPhrases.STMT_LOCK_TABLE);
- result = parseIgnorableStatement(tokens,
- getStatementTypeName(DerbyStatementStartPhrases.STMT_LOCK_TABLE),
- parentNode);
- markEndOfStatement(tokens, result);
- } else if (tokens.matches(DerbyStatementStartPhrases.STMT_RENAME_TABLE)) {
- markStartOfStatement(tokens);
- tokens.consume(DerbyStatementStartPhrases.STMT_RENAME_TABLE);
- result = parseIgnorableStatement(tokens,
- getStatementTypeName(DerbyStatementStartPhrases.STMT_RENAME_TABLE),
- parentNode);
- markEndOfStatement(tokens, result);
- } else if (tokens.matches(DerbyStatementStartPhrases.STMT_RENAME_INDEX)) {
- markStartOfStatement(tokens);
- tokens.consume(DerbyStatementStartPhrases.STMT_RENAME_INDEX);
- result = parseIgnorableStatement(tokens,
- getStatementTypeName(DerbyStatementStartPhrases.STMT_RENAME_INDEX),
- parentNode);
- markEndOfStatement(tokens, result);
- } else if (tokens.matches(DerbyStatementStartPhrases.STMT_DECLARE_GLOBAL_TEMP_TABLE)) {
- markStartOfStatement(tokens);
- tokens.consume(DerbyStatementStartPhrases.STMT_DECLARE_GLOBAL_TEMP_TABLE);
- result = parseIgnorableStatement(tokens,
- getStatementTypeName(DerbyStatementStartPhrases.STMT_DECLARE_GLOBAL_TEMP_TABLE),
- parentNode);
- markEndOfStatement(tokens, result);
+ if (tokens.matches(STMT_LOCK_TABLE)) {
+ result = parseLockTable(tokens, parentNode);
+ } else if (tokens.matches(STMT_RENAME_TABLE)) {
+ result = parseRenameTable(tokens, parentNode);
+ } else if (tokens.matches(STMT_RENAME_INDEX)) {
+ result = parseRenameIndex(tokens, parentNode);
+ } else if (tokens.matches(STMT_DECLARE_GLOBAL_TEMP_TABLE)) {
+ result = parseDeclareGlobalTempTable(tokens, parentNode);
}
}
return result;
@@ -156,31 +119,22 @@
assert tokens != null;
assert parentNode != null;
- if (tokens.matches(DerbyStatementStartPhrases.STMT_CREATE_INDEX)
- || tokens.matches(DerbyStatementStartPhrases.STMT_CREATE_UNIQUE_INDEX)) {
+ if (tokens.matches(STMT_CREATE_INDEX)
+ || tokens.matches(STMT_CREATE_UNIQUE_INDEX)) {
return parseCreateIndex(tokens, parentNode);
- } else if (tokens.matches(DerbyStatementStartPhrases.STMT_CREATE_FUNCTION)) {
+ } else if (tokens.matches(STMT_CREATE_FUNCTION)) {
+ return parseCreateFunction(tokens, parentNode);
+ } else if (tokens.matches(STMT_CREATE_PROCEDURE)) {
return parseStatement(tokens,
- DerbyStatementStartPhrases.STMT_CREATE_FUNCTION,
+ STMT_CREATE_PROCEDURE,
parentNode,
- TYPE_CREATE_FUNCTION_STATEMENT);
- } else if (tokens.matches(DerbyStatementStartPhrases.STMT_CREATE_PROCEDURE)) {
- return parseStatement(tokens,
- DerbyStatementStartPhrases.STMT_CREATE_PROCEDURE,
- parentNode,
TYPE_CREATE_PROCEDURE_STATEMENT);
- } else if (tokens.matches(DerbyStatementStartPhrases.STMT_CREATE_ROLE)) {
- return parseStatement(tokens, DerbyStatementStartPhrases.STMT_CREATE_ROLE, parentNode, TYPE_CREATE_ROLE_STATEMENT);
- } else if (tokens.matches(DerbyStatementStartPhrases.STMT_CREATE_SYNONYM)) {
- return parseStatement(tokens,
- DerbyStatementStartPhrases.STMT_CREATE_SYNONYM,
- parentNode,
- TYPE_CREATE_SYNONYM_STATEMENT);
- } else if (tokens.matches(DerbyStatementStartPhrases.STMT_CREATE_TRIGGER)) {
- return parseStatement(tokens,
- DerbyStatementStartPhrases.STMT_CREATE_TRIGGER,
- parentNode,
- TYPE_CREATE_TRIGGER_STATEMENT);
+ } else if (tokens.matches(STMT_CREATE_ROLE)) {
+ return parseStatement(tokens, STMT_CREATE_ROLE, parentNode, TYPE_CREATE_ROLE_STATEMENT);
+ } else if (tokens.matches(STMT_CREATE_SYNONYM)) {
+ return parseCreateSynonym(tokens, parentNode);
+ } else if (tokens.matches(STMT_CREATE_TRIGGER)) {
+ return parseCreateTrigger(tokens, parentNode);
}
return super.parseCreateStatement(tokens, parentNode);
@@ -217,14 +171,190 @@
indexNode.setProperty(UNIQUE_INDEX, isUnique);
indexNode.setProperty(TABLE_NAME, tableName);
+ parseIndexTableColumns(tokens, indexNode);
+
parseUntilTerminator(tokens);
markEndOfStatement(tokens, indexNode);
return indexNode;
}
+
+ private void parseIndexTableColumns(DdlTokenStream tokens, AstNode indexNode) throws ParsingException {
+ assert tokens != null;
+ assert indexNode != null;
+ // Assume we start with open parenthesis '(', then we parse comma separated list of column names followed by optional
+ // ASC or DESC
+
+
+ tokens.consume(L_PAREN); // EXPECTED
+
+ while (!tokens.canConsume(R_PAREN)) {
+ String colName = parseName(tokens);
+ AstNode colRefNode = nodeFactory().node(colName, indexNode, TYPE_INDEX_COLUMN_REFERENCE);
+ if( tokens.canConsume("ASC")) {
+ colRefNode.setProperty(ORDER, "ASC");
+ } else if( tokens.canConsume("DESC")) {
+ colRefNode.setProperty(ORDER, "DESC");
+ }
+ tokens.canConsume(COMMA);
+ }
+ }
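The column-list loop in `parseIndexTableColumns` follows a common parenthesized-list pattern. The sketch below reproduces it standalone (hypothetical `parse` method, column order encoded as `name:ORDER` strings rather than AST nodes); treating ASC as the default when no order keyword follows is an assumption of this sketch, since the original simply leaves the `ORDER` property unset.

```java
import java.util.ArrayList;
import java.util.List;

public class IndexColumns {
    // Parses "( col [ASC|DESC] {, col [ASC|DESC]}* )" into "name:ORDER" pairs.
    public static List<String> parse(List<String> tokens) {
        List<String> cols = new ArrayList<>();
        int i = 0;
        if (!tokens.get(i++).equals("(")) {
            throw new IllegalArgumentException("expected (");
        }
        while (!tokens.get(i).equals(")")) {
            String name = tokens.get(i++);
            String order = "ASC"; // assumed default; the original leaves it unset
            if (tokens.get(i).equals("ASC") || tokens.get(i).equals("DESC")) {
                order = tokens.get(i++);
            }
            cols.add(name + ":" + order);
            if (tokens.get(i).equals(",")) i++; // optional comma, like canConsume(COMMA)
        }
        return cols;
    }
}
```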
+
/**
+ * Parses DDL CREATE FUNCTION statement
+ *
+ * @param tokens the tokenized {@link DdlTokenStream} of the DDL input content; may not be null
+ * @param parentNode the parent {@link AstNode} node; may not be null
+ * @return the parsed CREATE FUNCTION statement node
+ * @throws ParsingException
+ */
+ protected AstNode parseCreateFunction( DdlTokenStream tokens,
+ AstNode parentNode ) throws ParsingException {
+ assert tokens != null;
+ assert parentNode != null;
+
+ markStartOfStatement(tokens);
+ // CREATE FUNCTION function-name ( [ FunctionParameter [, FunctionParameter] ] * )
+ // RETURNS ReturnDataType [ FunctionElement ] *
+
+ //FunctionElement
+ // {
+ // | LANGUAGE { JAVA }
+ // | {DETERMINISTIC | NOT DETERMINISTIC}
+ // | EXTERNAL NAME string
+ // | PARAMETER STYLE {JAVA | DERBY_JDBC_RESULT_SET}
+ // | { NO SQL | CONTAINS SQL | READS SQL DATA }
+ // | { RETURNS NULL ON NULL INPUT | CALLED ON NULL INPUT }
+ // }
+        tokens.consume(CREATE, "FUNCTION"); // CREATE FUNCTION
+
+ String functionName = parseName(tokens);
+
+ AstNode functionNode = nodeFactory().node(functionName, parentNode, TYPE_CREATE_FUNCTION_STATEMENT);
+
+ parseFunctionParameters(tokens, functionNode);
+
+ tokens.consume("RETURNS");
+
+ if( tokens.canConsume("TABLE")) {
+ AstNode tableNode = nodeFactory().node("TABLE", functionNode, TYPE_CREATE_TABLE_STATEMENT);
+ parseColumnsAndConstraints(tokens, tableNode);
+ tableNode.setProperty(IS_TABLE_TYPE, true);
+ } else {
+ // Assume DataType
+ DataType datatype = getDatatypeParser().parse(tokens);
+ if( datatype != null) {
+ getDatatypeParser().setPropertiesOnNode(functionNode, datatype);
+ } else {
+ String msg = DdlSequencerI18n.missingReturnTypeForFunction.text(functionName);
+ DdlParserProblem problem = new DdlParserProblem(Problems.WARNING, getCurrentMarkedPosition(), msg);
+ addProblem(problem, functionNode);
+ }
+ }
+
+ while( !isTerminator(tokens)) {
+ if( tokens.matches("LANGUAGE")) {
+ AstNode optionNode = nodeFactory().node("language", functionNode, TYPE_STATEMENT_OPTION);
+ if( tokens.canConsume("LANGUAGE", "JAVA")) {
+ optionNode.setProperty(VALUE, "LANGUAGE JAVA");
+ } else {
+ tokens.consume("LANGUAGE");
+ optionNode.setProperty(VALUE, "LANGUAGE");
+ }
+ } else if( tokens.canConsume("DETERMINISTIC")) {
+ AstNode optionNode = nodeFactory().node("deterministic", functionNode, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, "DETERMINISTIC");
+ } else if( tokens.canConsume("NOT", "DETERMINISTIC")) {
+ AstNode optionNode = nodeFactory().node("deterministic", functionNode, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, "NOT DETERMINISTIC");
+ } else if( tokens.canConsume("EXTERNAL", "NAME")) {
+ String extName = parseName(tokens);
+ AstNode optionNode = nodeFactory().node("externalName", functionNode, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, "EXTERNAL NAME" + SPACE + extName);
+ } else if( tokens.canConsume("PARAMETER", "STYLE")) {
+ AstNode optionNode = nodeFactory().node("parameterStyle", functionNode, TYPE_STATEMENT_OPTION);
+ if( tokens.canConsume("JAVA")) {
+ optionNode.setProperty(VALUE, "PARAMETER STYLE" + SPACE + "JAVA");
+ } else {
+ tokens.consume("DERBY_JDBC_RESULT_SET");
+ optionNode.setProperty(VALUE, "PARAMETER STYLE" + SPACE + "DERBY_JDBC_RESULT_SET");
+ }
+ } else if( tokens.canConsume("NO", "SQL")) {
+ AstNode optionNode = nodeFactory().node("sqlStatus", functionNode, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, "NO SQL");
+ } else if( tokens.canConsume("CONTAINS", "SQL")) {
+ AstNode optionNode = nodeFactory().node("sqlStatus", functionNode, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, "CONTAINS SQL");
+ } else if( tokens.canConsume("READS", "SQL", "DATA")) {
+ AstNode optionNode = nodeFactory().node("sqlStatus", functionNode, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, "READS SQL DATA");
+ } else if( tokens.canConsume("RETURNS", "NULL", "ON", "NULL", "INPUT")) {
+ AstNode optionNode = nodeFactory().node("nullInput", functionNode, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, "RETURNS NULL ON NULL INPUT");
+ } else if( tokens.canConsume("CALLED", "ON", "NULL", "INPUT")) {
+ AstNode optionNode = nodeFactory().node("nullInput", functionNode, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, "CALLED ON NULL INPUT");
+ }
+ }
+
+ markEndOfStatement(tokens, functionNode);
+
+ return functionNode;
+ }
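The FunctionElement loop above advances only when a branch matches; the simplified sketch below shows the same keyword-to-option mapping for two of the elements. One deliberate difference, flagged in the comment: this sketch skips unrecognized tokens, whereas the committed loop would spin on a token that matches no branch. All names here are hypothetical, not the DNA API.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class FunctionElements {
    // Collects recognized FunctionElement clauses into option-name -> value pairs,
    // stopping at the statement terminator (";" in this sketch).
    public static Map<String, String> parse(List<String> tokens) {
        Map<String, String> options = new LinkedHashMap<>();
        int i = 0;
        while (i < tokens.size() && !tokens.get(i).equals(";")) {
            if (tokens.get(i).equals("LANGUAGE")
                    && i + 1 < tokens.size() && tokens.get(i + 1).equals("JAVA")) {
                options.put("language", "LANGUAGE JAVA");
                i += 2;
            } else if (tokens.get(i).equals("DETERMINISTIC")) {
                options.put("deterministic", "DETERMINISTIC");
                i++;
            } else {
                i++; // defensive: skip unknown tokens instead of looping forever
            }
        }
        return options;
    }
}
```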
+
+ private void parseFunctionParameters( DdlTokenStream tokens, AstNode functionNode) throws ParsingException {
+ assert tokens != null;
+ assert functionNode != null;
+
+ // Assume we start with open parenthesis '(', then we parse comma separated list of function parameters
+ // which have the form: [ parameter-Name ] DataType
+ // So, try getting datatype, if datatype == NULL, then parseName() & parse datatype, then repeat as long as next token is ","
+
+ tokens.consume(L_PAREN); // EXPECTED
+
+ while (!tokens.canConsume(R_PAREN)) {
+ DataType datatype = getDatatypeParser().parse(tokens);
+ if( datatype == null ) {
+ String paramName = parseName(tokens);
+ datatype = getDatatypeParser().parse(tokens);
+ AstNode paramNode = nodeFactory().node(paramName, functionNode, TYPE_FUNCTION_PARAMETER);
+ getDatatypeParser().setPropertiesOnNode(paramNode, datatype);
+ } else {
+ AstNode paramNode = nodeFactory().node("functionParameter", functionNode, TYPE_FUNCTION_PARAMETER);
+ getDatatypeParser().setPropertiesOnNode(paramNode, datatype);
+ }
+ tokens.canConsume(COMMA);
+ }
+ }
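The `[ parameter-Name ] DataType` ambiguity in `parseFunctionParameters` is resolved by probing the datatype parser first. The standalone sketch below uses a small keyword set as a stand-in for that probe (the set and all names are illustrative assumptions); unnamed parameters get the placeholder name `functionParameter`, matching the node name used above.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class FunctionParams {
    // Stand-in for getDatatypeParser().parse(tokens) returning non-null.
    private static final Set<String> TYPES = new HashSet<>(
            Arrays.asList("INT", "INTEGER", "VARCHAR", "DOUBLE"));

    // Parses "( [name] type {, [name] type}* )" into "name:TYPE" pairs.
    public static List<String> parse(List<String> tokens) {
        List<String> params = new ArrayList<>();
        int i = 1; // skip "("
        while (!tokens.get(i).equals(")")) {
            String first = tokens.get(i++);
            if (TYPES.contains(first.toUpperCase())) {
                params.add("functionParameter:" + first); // unnamed parameter
            } else {
                params.add(first + ":" + tokens.get(i++)); // named parameter
            }
            if (tokens.get(i).equals(",")) i++; // optional comma
        }
        return params;
    }
}
```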
+
+    /**
+     * Parses DDL CREATE PROCEDURE statement
+     *
+     * @param tokens the tokenized {@link DdlTokenStream} of the DDL input content; may not be null
+     * @param parentNode the parent {@link AstNode} node; may not be null
+     * @return the parsed CREATE PROCEDURE statement node
+     * @throws ParsingException
+     */
+ protected AstNode parseCreateProcedure( DdlTokenStream tokens,
+ AstNode parentNode ) throws ParsingException {
+ assert tokens != null;
+ assert parentNode != null;
+
+ markStartOfStatement(tokens);
+
+        tokens.consume(CREATE, "PROCEDURE"); // CREATE PROCEDURE
+
+        String procedureName = parseName(tokens);
+
+        AstNode procedureNode = nodeFactory().node(procedureName, parentNode, TYPE_CREATE_PROCEDURE_STATEMENT);
+
+        markEndOfStatement(tokens, procedureNode);
+
+        return procedureNode;
+ }
+ /**
* {@inheritDoc}
*
* @see org.jboss.dna.sequencer.ddl.StandardDdlParser#parseDropStatement(org.jboss.dna.sequencer.ddl.DdlTokenStream,
@@ -240,53 +370,34 @@
String name = null;
- if (tokens.matches(DerbyStatementStartPhrases.STMT_DROP_FUNCTION)) {
+ if (tokens.matches(STMT_DROP_FUNCTION)) {
markStartOfStatement(tokens);
- tokens.consume(DerbyStatementStartPhrases.STMT_DROP_FUNCTION);
+ tokens.consume(STMT_DROP_FUNCTION);
name = parseName(tokens);
dropNode = nodeFactory().node(name, parentNode, TYPE_DROP_FUNCTION_STATEMENT);
- } else if (tokens.matches(DerbyStatementStartPhrases.STMT_DROP_INDEX)) {
+ } else if (tokens.matches(STMT_DROP_INDEX)) {
markStartOfStatement(tokens);
- tokens.consume(DerbyStatementStartPhrases.STMT_DROP_INDEX);
+ tokens.consume(STMT_DROP_INDEX);
name = parseName(tokens);
dropNode = nodeFactory().node(name, parentNode, TYPE_DROP_INDEX_STATEMENT);
- } else if (tokens.matches(DerbyStatementStartPhrases.STMT_DROP_PROCEDURE)) {
+ } else if (tokens.matches(STMT_DROP_PROCEDURE)) {
markStartOfStatement(tokens);
- tokens.consume(DerbyStatementStartPhrases.STMT_DROP_PROCEDURE);
+ tokens.consume(STMT_DROP_PROCEDURE);
name = parseName(tokens);
dropNode = nodeFactory().node(name, parentNode, TYPE_DROP_PROCEDURE_STATEMENT);
-
- // CREATE PROCEDURE procedure-Name ( [ ProcedureParameter [, ProcedureParameter] ] * ) [ ProcedureElement ] *
- // ProcedureParameter:
- // [ { IN | OUT | INOUT } ] [ parameter-Name ] DataType
- // ProcedureElement:
- //
- // {
- // | [ DYNAMIC ] RESULT SETS INTEGER
- // | LANGUAGE { JAVA }
- // | DeterministicCharacteristic
- // | EXTERNAL NAME string
- // | PARAMETER STYLE JAVA
- // | { NO SQL | MODIFIES SQL DATA | CONTAINS SQL | READS SQL DATA }
- // }
-
- // TODO: BARRY
- } else if (tokens.matches(DerbyStatementStartPhrases.STMT_DROP_ROLE)) {
+ } else if (tokens.matches(STMT_DROP_ROLE)) {
markStartOfStatement(tokens);
- tokens.consume(DerbyStatementStartPhrases.STMT_DROP_ROLE);
+ tokens.consume(STMT_DROP_ROLE);
name = parseName(tokens);
dropNode = nodeFactory().node(name, parentNode, TYPE_DROP_ROLE_STATEMENT);
- } else if (tokens.matches(DerbyStatementStartPhrases.STMT_DROP_SYNONYM)) {
+ } else if (tokens.matches(STMT_DROP_SYNONYM)) {
markStartOfStatement(tokens);
- tokens.consume(DerbyStatementStartPhrases.STMT_DROP_SYNONYM);
+ tokens.consume(STMT_DROP_SYNONYM);
name = parseName(tokens);
dropNode = nodeFactory().node(name, parentNode, TYPE_DROP_SYNONYM_STATEMENT);
- // CREATE SYNONYM synonym-Name FOR { view-Name | table-Name }
-
- // TODO: BARRY
- } else if (tokens.matches(DerbyStatementStartPhrases.STMT_DROP_TRIGGER)) {
+ } else if (tokens.matches(STMT_DROP_TRIGGER)) {
markStartOfStatement(tokens);
- tokens.consume(DerbyStatementStartPhrases.STMT_DROP_TRIGGER);
+ tokens.consume(STMT_DROP_TRIGGER);
name = parseName(tokens);
dropNode = nodeFactory().node(name, parentNode, TYPE_DROP_TRIGGER_STATEMENT);
}
@@ -317,73 +428,152 @@
AstNode parentNode ) throws ParsingException {
assert tokens != null;
assert parentNode != null;
+ assert tokens.matches(GRANT);
+
+ markStartOfStatement(tokens);
- return super.parseGrantStatement(tokens, parentNode);
- // Statement stmt = null;
- //
- // if( tokens.matches(GRANT, DdlTokenStream.ANY_VALUE, "TO")) {
- // stmt = new TypedStatement();
- // consume(tokens, stmt, false, GRANT);
- // String privilege = consume(tokens, stmt, true);
- // tokens.consume("TO");
- // String toValue = consume(tokens, stmt, true);
+ // Syntax for tables
//
- // String value = parseUntilTerminator(tokens);
- // stmt.appendSource(true, value);
- // stmt.setType("GRANT" + SPACE + privilege + SPACE + "TO" + SPACE + toValue);
- // consumeTerminator(tokens);
- // return stmt;
- // } else if( tokens.matches(GRANT, DdlTokenStream.ANY_VALUE, "ON")) {
- // stmt = new TypedStatement();
- // consume(tokens, stmt, false, GRANT);
- // String privilege = consume(tokens, stmt, true);
- // tokens.consume("ON");
- //
- // tokens.canConsume("TABLE");
- //
- // String onValue = tokens.consume();
- //
- // String value = parseUntilTerminator(tokens);
- // stmt.appendSource(true, value);
- // stmt.setType("GRANT" + SPACE + privilege + SPACE + "ON" + SPACE + onValue);
- // consumeTerminator(tokens);
- // return stmt;
- // } else if( tokens.matches(GRANT, "ALL", "PRIVILEGES", "ON")) {
- // stmt = new TypedStatement();
- // consume(tokens, stmt, false, GRANT);
- // String privilege = consume(tokens, stmt, true,"ALL", "PRIVILEGES", "ON");
- // tokens.canConsume("TABLE");
- //
- // String onValue = tokens.consume();
- //
- // String value = parseUntilTerminator(tokens);
- // stmt.appendSource(true, value);
- // stmt.setType("GRANT" + SPACE + privilege + SPACE + "ON" + SPACE + onValue);
- // consumeTerminator(tokens);
- // return stmt;
- // } else if( tokens.matches(GRANT, "SELECT") ||
- // tokens.matches(GRANT, "UPDATE") ||
- // tokens.matches(GRANT, "DELETE") ||
- // tokens.matches(GRANT, "INSERT") ||
- // tokens.matches(GRANT, "TRIGGER") ||
- // tokens.matches(GRANT, "REFERENCES")) {
- // stmt = new TypedStatement();
- // consume(tokens, stmt, false, GRANT);
- //
- // String nextTok = consume(tokens, stmt, true) + SPACE + consume(tokens, stmt, true) + SPACE + consume(tokens, stmt,
- // true);
+ // GRANT privilege-type ON [TABLE] { table-Name | view-Name } TO grantees
//
- // String value = parseUntilTerminator(tokens);
- // stmt.appendSource(true, value);
- // stmt.setType("GRANT" + SPACE + nextTok);
- // consumeTerminator(tokens);
- // return stmt;
- // }
- //
- //
- // return null;
+ // Syntax for routines
+ //
+ // GRANT EXECUTE ON { FUNCTION | PROCEDURE } {function-name | procedure-name} TO grantees
+ //
+ // Syntax for roles
+ //
+ // GRANT roleName [ {, roleName }* ] TO grantees
+
+ // privilege-types
+ //
+ // ALL PRIVILEGES | privilege-list
+ //
+ AstNode grantNode = null;
+ boolean allPrivileges = false;
+
+ List<AstNode> privileges = new ArrayList<AstNode>();
+
+ tokens.consume(GRANT);
+ if(tokens.canConsume("EXECUTE", "ON")) {
+ AstNode node = nodeFactory().node("privilege");
+ nodeFactory().setType(node, GRANT_PRIVILEGE);
+ node.setProperty(TYPE, "EXECUTE");
+ privileges.add(node);
+ if( tokens.canConsume("FUNCTION") ) {
+ String name = parseName(tokens);
+ grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_FUNCTION_STATEMENT);
+ } else {
+ tokens.consume("PROCEDURE");
+ String name = parseName(tokens);
+ grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_PROCEDURE_STATEMENT);
+ }
+ } else {
+
+ if( tokens.canConsume("ALL", "PRIVILEGES")) {
+ allPrivileges = true;
+ } else {
+ parseGrantPrivileges(tokens, privileges);
+
+ if( privileges.isEmpty() ) {
+ // ASSUME: GRANT roleName [ {, roleName }* ] TO grantees
+ grantNode = nodeFactory().node("grantRoles", parentNode, TYPE_GRANT_ROLES_STATEMENT);
+ do {
+ String roleName = parseName(tokens);
+ nodeFactory().node(roleName, grantNode, ROLE_NAME);
+ } while( tokens.canConsume(COMMA));
+ }
+ }
+ if( grantNode == null ) {
+ tokens.consume("ON");
+ tokens.canConsume(TABLE); // OPTIONAL
+ String name = parseName(tokens);
+ grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_TABLE_STATEMENT);
+ // Attach privileges to grant node
+ for( AstNode node : privileges ) {
+ node.setParent(grantNode);
+ }
+ if( allPrivileges ) {
+ grantNode.setProperty(ALL_PRIVILEGES, allPrivileges);
+ }
+ }
+
+ }
+
+ tokens.consume("TO");
+
+ do {
+ String grantee = parseName(tokens);
+ nodeFactory().node(grantee, grantNode, GRANTEE);
+ } while( tokens.canConsume(COMMA));
+
+ markEndOfStatement(tokens, grantNode);
+
+ return grantNode;
}
+
+ /**
+ *
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.sequencer.ddl.StandardDdlParser#parseGrantPrivileges(org.jboss.dna.sequencer.ddl.DdlTokenStream, java.util.List)
+ */
+ @Override
+ protected void parseGrantPrivileges( DdlTokenStream tokens, List<AstNode> privileges) throws ParsingException {
+ // privilege-types
+ //
+ // ALL PRIVILEGES | privilege-list
+ //
+ // privilege-list
+ //
+ // table-privilege {, table-privilege }*
+ //
+ // table-privilege
+ // DELETE |
+ // INSERT |
+ // REFERENCES [column list] |
+ // SELECT [column list] |
+ // TRIGGER |
+ // UPDATE [column list]
+ // column list
+ // ( column-identifier {, column-identifier}* )
+
+ do {
+ AstNode node = null;
+
+ if( tokens.canConsume(DELETE)) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, DELETE);
+ } else if( tokens.canConsume(INSERT)) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, INSERT);
+ } else if( tokens.canConsume("REFERENCES")) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, "REFERENCES");
+ parseColumnNameList(tokens, node, TYPE_COLUMN_REFERENCE);
+ } else if( tokens.canConsume(SELECT)) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, SELECT);
+ parseColumnNameList(tokens, node, TYPE_COLUMN_REFERENCE);
+ } else if( tokens.canConsume("TRIGGER")) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, "TRIGGER");
+ } else if( tokens.canConsume(UPDATE)) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, UPDATE);
+ parseColumnNameList(tokens, node, TYPE_COLUMN_REFERENCE);
+ }
+ if( node == null) {
+ break;
+ }
+ nodeFactory().setType(node, GRANT_PRIVILEGE);
+ privileges.add(node);
+
+ } while( tokens.canConsume(COMMA));
+ }
+
/**
* {@inheritDoc}
*
@@ -668,7 +858,324 @@
return false;
}
+
+ private AstNode parseDeclareGlobalTempTable( DdlTokenStream tokens,
+ AstNode parentNode ) throws ParsingException {
+ assert tokens != null;
+ assert parentNode != null;
+ markStartOfStatement(tokens);
+
+ // DECLARE GLOBAL TEMPORARY TABLE table-Name
+ // { column-definition [ , column-definition ] * }
+ // [ ON COMMIT {DELETE | PRESERVE} ROWS ]
+ // NOT LOGGED [ON ROLLBACK DELETE ROWS]
+
+ tokens.consume(STMT_DECLARE_GLOBAL_TEMP_TABLE);
+ String name = parseName(tokens);
+
+ AstNode node = nodeFactory().node(name, parentNode, TYPE_DECLARE_GLOBAL_TEMPORARY_TABLE_STATEMENT);
+
+ parseColumnsAndConstraints(tokens, node);
+
+ if( tokens.canConsume("ON", "COMMIT")) {
+ AstNode optionNode = nodeFactory().node("onCommit", node, TYPE_STATEMENT_OPTION);
+ if( tokens.canConsume("DELETE", "ROWS")) {
+ optionNode.setProperty(VALUE, "ON COMMIT DELETE ROWS");
+ } else {
+ tokens.consume("PRESERVE", "ROWS");
+ optionNode.setProperty(VALUE, "ON COMMIT PRESERVE ROWS");
+ }
+ }
+ tokens.consume("NOT", "LOGGED");
+
+ if( tokens.canConsume("ON", "ROLLBACK", "DELETE", "ROWS")) {
+ AstNode optionNode = nodeFactory().node("onRollback", node, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, "ON ROLLBACK DELETE ROWS");
+ }
+
+ markEndOfStatement(tokens, node);
+
+ return node;
+ }
+
+ private AstNode parseLockTable( DdlTokenStream tokens,
+ AstNode parentNode ) throws ParsingException {
+ assert tokens != null;
+ assert parentNode != null;
+
+ markStartOfStatement(tokens);
+
+ // LOCK TABLE table-Name IN { SHARE | EXCLUSIVE } MODE;
+
+ tokens.consume(STMT_LOCK_TABLE);
+
+ String name = parseName(tokens);
+
+ AstNode node = nodeFactory().node(name, parentNode, TYPE_LOCK_TABLE_STATEMENT);
+
+ tokens.consume("IN");
+
+ if( tokens.canConsume("SHARE")) {
+ AstNode propNode = nodeFactory().node("lockMode", node, TYPE_STATEMENT_OPTION);
+ propNode.setProperty(VALUE, "SHARE");
+ } else {
+ tokens.consume("EXCLUSIVE");
+ AstNode propNode = nodeFactory().node("lockMode", node, TYPE_STATEMENT_OPTION);
+ propNode.setProperty(VALUE, "EXCLUSIVE");
+ }
+ tokens.consume("MODE");
+
+ markEndOfStatement(tokens, node);
+
+ return node;
+ }
+
+ private AstNode parseRenameTable( DdlTokenStream tokens,
+ AstNode parentNode ) throws ParsingException {
+ assert tokens != null;
+ assert parentNode != null;
+
+ markStartOfStatement(tokens);
+
+ // RENAME TABLE SAMP.EMP_ACT TO EMPLOYEE_ACT;
+
+ tokens.consume(STMT_RENAME_TABLE);
+
+ String oldName = parseName(tokens);
+
+ AstNode node = nodeFactory().node(oldName, parentNode, TYPE_RENAME_TABLE_STATEMENT);
+
+ tokens.consume("TO");
+
+ String newName = parseName(tokens);
+
+ node.setProperty(NEW_NAME, newName);
+
+ markEndOfStatement(tokens, node);
+
+ return node;
+ }
+
+ private AstNode parseRenameIndex( DdlTokenStream tokens,
+ AstNode parentNode ) throws ParsingException {
+ assert tokens != null;
+ assert parentNode != null;
+
+ markStartOfStatement(tokens);
+
+ // RENAME INDEX index-Name TO new-index-Name;
+
+ tokens.consume(STMT_RENAME_INDEX);
+
+ String oldName = parseName(tokens);
+
+ AstNode node = nodeFactory().node(oldName, parentNode, TYPE_RENAME_INDEX_STATEMENT);
+
+ tokens.consume("TO");
+
+ String newName = parseName(tokens);
+
+ node.setProperty(NEW_NAME, newName);
+
+ markEndOfStatement(tokens, node);
+
+ return node;
+ }
+
+ private AstNode parseCreateSynonym( DdlTokenStream tokens,
+ AstNode parentNode ) throws ParsingException {
+ assert tokens != null;
+ assert parentNode != null;
+
+ markStartOfStatement(tokens);
+ // CREATE SYNONYM synonym-Name FOR { view-Name | table-Name }
+
+ tokens.consume(STMT_CREATE_SYNONYM);
+
+ String name = parseName(tokens);
+
+ AstNode node = nodeFactory().node(name, parentNode, TYPE_CREATE_SYNONYM_STATEMENT);
+
+ tokens.consume("FOR");
+
+ String tableOrViewName = parseName(tokens);
+
+ node.setProperty(TABLE_NAME, tableOrViewName);
+
+ markEndOfStatement(tokens, node);
+
+ return node;
+ }
+
+ private AstNode parseCreateTrigger( DdlTokenStream tokens,
+ AstNode parentNode ) throws ParsingException {
+ assert tokens != null;
+ assert parentNode != null;
+
+ markStartOfStatement(tokens);
+ // CREATE TRIGGER TriggerName
+ // { AFTER | NO CASCADE BEFORE }
+ // { INSERT | DELETE | UPDATE [ OF column-Name [, column-Name]* ] }
+ // ON table-Name
+ // [ ReferencingClause ]
+ // [ FOR EACH { ROW | STATEMENT } ] [ MODE DB2SQL ]
+ // Triggered-SQL-statement
+
+ // ReferencingClause
+ // REFERENCING
+ // {
+ // { OLD | NEW } [ ROW ] [ AS ] correlation-Name [ { OLD | NEW } [ ROW ] [ AS ] correlation-Name ] |
+ // { OLD TABLE | NEW TABLE } [ AS ] Identifier [ { OLD TABLE | NEW TABLE } [AS] Identifier ] |
+ // { OLD_TABLE | NEW_TABLE } [ AS ] Identifier [ { OLD_TABLE | NEW_TABLE } [AS] Identifier ]
+ // }
+
+
+ // EXAMPLE:
+ // CREATE TRIGGER t1 NO CASCADE BEFORE UPDATE ON x
+ // FOR EACH ROW MODE DB2SQL
+ // values app.notifyEmail('Jerry', 'Table x is about to be updated');
+
+ tokens.consume(STMT_CREATE_TRIGGER);
+
+ String name = parseName(tokens);
+
+ AstNode node = nodeFactory().node(name, parentNode, TYPE_CREATE_TRIGGER_STATEMENT);
+
+ String type = null;
+
+ if( tokens.canConsume("AFTER") ) {
+ AstNode optionNode = nodeFactory().node("beforeOrAfter", node, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, "AFTER");
+ } else {
+ tokens.consume("NO", "CASCADE", "BEFORE");
+ AstNode optionNode = nodeFactory().node("beforeOrAfter", node, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, "NO CASCADE BEFORE");
+ }
+
+ if( tokens.canConsume(INSERT)) {
+ AstNode optionNode = nodeFactory().node("eventType", node, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, INSERT);
+ type = INSERT;
+ } else if( tokens.canConsume(DELETE) ) {
+ AstNode optionNode = nodeFactory().node("eventType", node, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, DELETE);
+ type = DELETE;
+ } else {
+ tokens.consume(UPDATE);
+ AstNode optionNode = nodeFactory().node("eventType", node, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, UPDATE);
+ type = UPDATE;
+ }
+
+ if( tokens.canConsume("OF") ) {
+ // Parse comma separated column names
+ String colName = parseName(tokens);
+ nodeFactory().node(colName, node, TYPE_COLUMN_REFERENCE);
+
+ while( tokens.canConsume(COMMA)) {
+ colName = parseName(tokens);
+ nodeFactory().node(colName, node, TYPE_COLUMN_REFERENCE);
+ }
+ }
+ tokens.consume("ON");
+
+ String tableName = parseName(tokens);
+
+ node.setProperty(TABLE_NAME, tableName);
+
+ if( tokens.canConsume("REFERENCING") ) {
+ // ReferencingClause
+ // REFERENCING
+ // {
+ // { OLD | NEW } [ ROW ] [ AS ] correlation-Name [ { OLD | NEW } [ ROW ] [ AS ] correlation-Name ] |
+ // { OLD TABLE | NEW TABLE } [ AS ] Identifier [ { OLD TABLE | NEW TABLE } [AS] Identifier ] |
+ // { OLD_TABLE | NEW_TABLE } [ AS ] Identifier [ { OLD_TABLE | NEW_TABLE } [AS] Identifier ]
+ // }
+
+ StringBuffer sb = new StringBuffer();
+ if( tokens.matchesAnyOf("OLD", "NEW") ) {
+ if( tokens.canConsume("OLD")) {
+ sb.append("OLD");
+ } else {
+ tokens.consume("NEW");
+ sb.append("NEW");
+ }
+ if( tokens.canConsume("ROW") ) {
+ sb.append(SPACE).append("ROW");
+ }
+ if( tokens.canConsume("AS") ) {
+ sb.append(SPACE).append("AS");
+ }
+ if( tokens.matchesAnyOf("OLD", "NEW")) {
+ if( tokens.canConsume("OLD")) {
+ sb.append(SPACE).append("OLD");
+ } else {
+ tokens.consume("NEW");
+ sb.append(SPACE).append("NEW");
+ }
+
+ if( tokens.canConsume("ROW") ) {
+ sb.append(SPACE).append("ROW");
+ }
+ if( tokens.canConsume("AS") ) {
+ sb.append(SPACE).append("AS");
+ }
+ if( ! tokens.matchesAnyOf("FOR", "MODE", type)) {
+ String corrName = parseName(tokens);
+ sb.append(SPACE).append(corrName);
+ }
+ } else {
+ String corrName = parseName(tokens);
+ sb.append(SPACE).append(corrName);
+
+ if( tokens.matchesAnyOf("OLD", "NEW") ) {
+ if( tokens.canConsume("OLD")) {
+ sb.append(SPACE).append("OLD");
+ } else {
+ tokens.consume("NEW");
+ sb.append(SPACE).append("NEW");
+ }
+
+ if( tokens.canConsume("ROW") ) {
+ sb.append(SPACE).append("ROW");
+ }
+ if( tokens.canConsume("AS") ) {
+ sb.append(SPACE).append("AS");
+ }
+ if( ! tokens.matchesAnyOf("FOR", "MODE", type)) {
+ corrName = parseName(tokens);
+ sb.append(SPACE).append(corrName);
+ }
+ }
+ }
+ }
+ }
+ //[ FOR EACH { ROW | STATEMENT } ] [ MODE DB2SQL ]
+ if( tokens.canConsume("FOR", "EACH")) {
+ if( tokens.canConsume("ROW")) {
+ AstNode optionNode = nodeFactory().node("forEach", node, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, "FOR EACH ROW");
+ } else {
+ tokens.consume("STATEMENT");
+ AstNode optionNode = nodeFactory().node("forEach", node, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, "FOR EACH STATEMENT");
+ }
+ }
+ if( tokens.canConsume("MODE")) {
+ tokens.consume("DB2SQL");
+ AstNode optionNode = nodeFactory().node("mode", node, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, "MODE DB2SQL");
+ }
+
+ String sql = parseUntilTerminatorIgnoreEmbeddedStatements(tokens);
+ node.setProperty(SQL, sql);
+
+ markEndOfStatement(tokens, node);
+
+ return node;
+ }
+
/**
* {@inheritDoc}
*
@@ -796,23 +1303,28 @@
dataType.setKMGLength(isKMGLength);
dataType.setKMGValue(kmgValue);
} else if (tokens.matches(DerbyDataTypes.DTYPE_BIGINT)) {
+ dataType = new DataType();
typeName = consume(tokens, dataType, true);
- dataType = new DataType(typeName);
+ dataType.setName(typeName);
} else if (tokens.matches(DerbyDataTypes.DTYPE_LONG_VARCHAR_FBD)) {
+ dataType = new DataType();
typeName = consume(tokens, dataType, true) + SPACE + consume(tokens, dataType, true) + SPACE
+ consume(tokens, dataType, true) + SPACE + consume(tokens, dataType, true) + SPACE
+ consume(tokens, dataType, true);
- dataType = new DataType(typeName);
+ dataType.setName(typeName);
} else if (tokens.matches(DerbyDataTypes.DTYPE_LONG_VARCHAR)) {
+ dataType = new DataType();
typeName = consume(tokens, dataType, true) + SPACE + consume(tokens, dataType, true);
- dataType = new DataType(typeName);
+ dataType.setName(typeName);
} else if (tokens.matches(DerbyDataTypes.DTYPE_DOUBLE)) {
+ dataType = new DataType();
typeName = consume(tokens, dataType, true);
- dataType = new DataType(typeName);
+ dataType.setName(typeName);
} else if (tokens.matches(DerbyDataTypes.DTYPE_XML)) {
+ dataType = new DataType();
typeName = consume(tokens, dataType, true);
- dataType = new DataType(typeName);
+ dataType.setName(typeName);
}
if (dataType == null) {
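The grantee and role lists in the Derby changes above are all parsed with the same consume-name / canConsume(COMMA) loop. Below is a minimal, self-contained sketch of that pattern; the `Tokens` class here is a hypothetical stand-in for DNA's `DdlTokenStream`, not the real API:

```java
import java.util.ArrayList;
import java.util.List;

public class CommaListSketch {
    // Hypothetical stand-in for DdlTokenStream: a cursor over pre-split tokens.
    static class Tokens {
        private final String[] toks;
        private int pos = 0;
        Tokens(String... toks) { this.toks = toks; }
        String consume() { return toks[pos++]; }
        boolean canConsume(String expected) {
            if (pos < toks.length && toks[pos].equals(expected)) { pos++; return true; }
            return false;
        }
    }

    // Mirrors the do/while pattern used for grantees and role names above:
    // parse one name, then keep going only while a comma follows.
    static List<String> parseNameList(Tokens tokens) {
        List<String> names = new ArrayList<>();
        do {
            names.add(tokens.consume());
        } while (tokens.canConsume(","));
        return names;
    }

    public static void main(String[] args) {
        Tokens t = new Tokens("alice", ",", "bob", ",", "carol");
        System.out.println(parseNameList(t)); // [alice, bob, carol]
    }
}
```

The do/while shape matters: a grantee list always has at least one name, so the first `parseName` is unconditional and the comma check only gates the repeats.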
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/oracle/OracleDdlParser.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/oracle/OracleDdlParser.java 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/oracle/OracleDdlParser.java 2010-01-05 13:31:47 UTC (rev 1527)
@@ -422,7 +422,27 @@
CheckArg.isNotNull(tokens, "tokens");
CheckArg.isNotNull(parentNode, "parentNode");
- // return super.parseGrantStatement(tokens, parentNode);
+ // GRANT { grant_system_privileges | grant_object_privileges } ;
+ //
+ // ** grant_system_privileges **
+ //
+ // { system_privilege | role | ALL PRIVILEGES } [, { system_privilege | role | ALL PRIVILEGES } ]...
+ // TO grantee_clause [ WITH ADMIN OPTION ]
+ //
+ // ** grant_object_privileges **
+ //
+ // { object_privilege | ALL [ PRIVILEGES ] } [ (column [, column ]...) ] [, { object_privilege | ALL [ PRIVILEGES ] } [ (column [, column ]...) ] ]...
+ // on_object_clause
+ // TO grantee_clause [ WITH HIERARCHY OPTION ] [ WITH GRANT OPTION ]
+
+ // ** on_object_clause **
+ //
+ // { [ schema. ] object | { DIRECTORY directory_name | JAVA { SOURCE | RESOURCE } [ schema. ] object } }
+ //
+ // ** grantee_clause **
+ //
+ // { user [ IDENTIFIED BY password ] | role | PUBLIC } [, { user [ IDENTIFIED BY password ] | role | PUBLIC } ]...
+
AstNode node = null;
// Original implementation does NOT parse the GRANT statement, but just returns a generic TypedStatement
@@ -444,8 +464,8 @@
markEndOfStatement(tokens, node);
- return node;
-
+ return node;
+
// if( tokens.matches(GRANT, DdlTokenStream.ANY_VALUE, "TO")) {
// markStartOfStatement(tokens);
// tokens.consume(GRANT);
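The Oracle grammar comments above split GRANT into system-privilege grants (which go straight from the privilege list to `TO grantee_clause`) and object-privilege grants (which carry an `on_object_clause` before `TO`). One way to tell them apart is to check whether `ON` appears before `TO`; this is a sketch of that idea over a plain token list, not the DNA parser's actual dispatch logic:

```java
import java.util.Arrays;
import java.util.List;

public class OracleGrantKindSketch {
    // Returns true when the statement is an object-privilege grant,
    // i.e. an ON clause appears before the TO clause; system-privilege
    // and role grants go directly from the privilege list to TO.
    static boolean isObjectGrant(List<String> tokens) {
        int on = tokens.indexOf("ON");
        int to = tokens.indexOf("TO");
        return on >= 0 && (to < 0 || on < to);
    }

    public static void main(String[] args) {
        List<String> objectGrant =
            Arrays.asList("GRANT", "SELECT", "ON", "EMP", "TO", "scott");
        List<String> systemGrant =
            Arrays.asList("GRANT", "CREATE", "TABLE", "TO", "scott");
        System.out.println(isObjectGrant(objectGrant));  // true
        System.out.println(isObjectGrant(systemGrant));  // false
    }
}
```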
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlConstants.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlConstants.java 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlConstants.java 2010-01-05 13:31:47 UTC (rev 1527)
@@ -243,10 +243,12 @@
public final static Name[] VALID_SCHEMA_CHILD_STMTS = {
StandardDdlLexicon.TYPE_CREATE_TABLE_STATEMENT,
StandardDdlLexicon.TYPE_CREATE_VIEW_STATEMENT,
- StandardDdlLexicon.TYPE_GRANT_STATEMENT,
+ StandardDdlLexicon.TYPE_GRANT_ON_TABLE_STATEMENT,
PostgresDdlLexicon.TYPE_CREATE_INDEX_STATEMENT,
PostgresDdlLexicon.TYPE_CREATE_SEQUENCE_STATEMENT,
- PostgresDdlLexicon.TYPE_CREATE_TRIGGER_STATEMENT
+ PostgresDdlLexicon.TYPE_CREATE_TRIGGER_STATEMENT,
+ PostgresDdlLexicon.TYPE_GRANT_ON_SEQUENCE_STATEMENT,
+ PostgresDdlLexicon.TYPE_GRANT_ON_SCHEMA_STATEMENT
};
public final static Name[] COMPLEX_STMT_TYPES = {
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlLexicon.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlLexicon.java 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlLexicon.java 2010-01-05 13:31:47 UTC (rev 1527)
@@ -133,9 +133,23 @@
public static final Name TYPE_UNLISTEN_STATEMENT = new BasicName(Namespace.URI, "unlistenStatement");
public static final Name TYPE_VACUUM_STATEMENT = new BasicName(Namespace.URI, "vacuumStatement");
- public static final Name TYPE_RENAME_COLUMN = new BasicName(Namespace.URI, "renameColumn");
+ public static final Name TYPE_GRANT_ON_SEQUENCE_STATEMENT = new BasicName(Namespace.URI, "grantOnSequenceStatement");
+ public static final Name TYPE_GRANT_ON_DATABASE_STATEMENT = new BasicName(Namespace.URI, "grantOnDatabaseStatement");
+ public static final Name TYPE_GRANT_ON_FOREIGN_DATA_WRAPPER_STATEMENT = new BasicName(Namespace.URI, "grantOnForeignDataWrapperStatement");
+ public static final Name TYPE_GRANT_ON_FOREIGN_SERVER_STATEMENT = new BasicName(Namespace.URI, "grantOnForeignServerStatement");
+ public static final Name TYPE_GRANT_ON_FUNCTION_STATEMENT = new BasicName(Namespace.URI, "grantOnFunctionStatement");
+ public static final Name TYPE_GRANT_ON_LANGUAGE_STATEMENT = new BasicName(Namespace.URI, "grantOnLanguageStatement");
+ public static final Name TYPE_GRANT_ON_SCHEMA_STATEMENT = new BasicName(Namespace.URI, "grantOnSchemaStatement");
+ public static final Name TYPE_GRANT_ON_TABLESPACE_STATEMENT = new BasicName(Namespace.URI, "grantOnTablespaceStatement");
+ public static final Name TYPE_GRANT_ON_PROCEDURE_STATEMENT = new BasicName(Namespace.URI, "grantOnProcedureStatement");
+ public static final Name TYPE_GRANT_ROLES_STATEMENT = new BasicName(Namespace.URI, "grantRolesStatement");
+ public static final Name TYPE_RENAME_COLUMN = new BasicName(Namespace.URI, "renamedColumn");
+
public static final Name SCHEMA_NAME = new BasicName(Namespace.URI, "schemaName");
+ public static final Name FUNCTION_PARAMETER = new BasicName(Namespace.URI, "functionParameter");
+ public static final Name FUNCTION_PARAMETER_MODE = new BasicName(Namespace.URI, "mode");
+ public static final Name ROLE = new BasicName(Namespace.URI, "role");
// PROPERTY NAMES
public static final Name TARGET_OBJECT_TYPE = new BasicName(Namespace.URI, "targetObjectType");
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlParser.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlParser.java 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlParser.java 2010-01-05 13:31:47 UTC (rev 1527)
@@ -1,118 +1,11 @@
package org.jboss.dna.sequencer.ddl.dialect.postgres;
import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DDL_EXPRESSION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DDL_ORIGINAL_EXPRESSION;
import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DDL_START_CHAR_INDEX;
import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DDL_START_COLUMN_NUMBER;
import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DDL_START_LINE_NUMBER;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DEFAULT_OPTION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DEFAULT_PRECISION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DEFAULT_VALUE;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.DROP_BEHAVIOR;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.NEW_NAME;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_ALTER_COLUMN_DEFINITION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_COLUMN_DEFINITION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_CREATE_TABLE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_COLUMN_DEFINITION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_DOMAIN_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_SCHEMA_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_TABLE_CONSTRAINT_DEFINITION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_TABLE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_VIEW_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_MISSING_TERMINATOR;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_STATEMENT_OPTION;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_UNKNOWN_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.VALUE;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.SCHEMA_NAME;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ABORT_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_AGGREGATE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_CONVERSION_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_DATABASE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_FOREIGN_DATA_WRAPPER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_FUNCTION_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_GROUP_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_INDEX_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_LANGUAGE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_OPERATOR_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_ROLE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_SCHEMA_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_SEQUENCE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_SERVER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_TABLESPACE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_TABLE_STATEMENT_POSTGRES;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_TEXT_SEARCH_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_TRIGGER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_TYPE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_USER_MAPPING_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_USER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_VIEW_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ANALYZE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CLUSTER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_COMMENT_ON_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_COPY_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_AGGREGATE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_CAST_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_CONSTRAINT_TRIGGER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_CONVERSION_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_DATABASE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_FOREIGN_DATA_WRAPPER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_FUNCTION_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_GROUP_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_INDEX_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_LANGUAGE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_OPERATOR_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_ROLE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_RULE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_SEQUENCE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_SERVER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_TABLESPACE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_TEXT_SEARCH_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_TRIGGER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_TYPE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_USER_MAPPING_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_USER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DEALLOCATE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DECLARE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_AGGREGATE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_CAST_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_CONSTRAINT_TRIGGER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_CONVERSION_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_DATABASE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_FOREIGN_DATA_WRAPPER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_FUNCTION_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_GROUP_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_INDEX_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_LANGUAGE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_OPERATOR_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_OWNED_BY_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_ROLE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_RULE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_SEQUENCE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_SERVER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_TABLESPACE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_TEXT_SEARCH_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_TRIGGER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_TYPE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_USER_MAPPING_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_DROP_USER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_EXPLAIN_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_FETCH_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_LISTEN_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_LOAD_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_LOCK_TABLE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_MOVE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_NOTIFY_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_PREPARE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_REASSIGN_OWNED_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_REINDEX_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_RELEASE_SAVEPOINT_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_RENAME_COLUMN;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ROLLBACK_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_SELECT_INTO_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_SHOW_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_TRUNCATE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_UNLISTEN_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_VACUUM_STATEMENT;
+import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.*;
+import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.*;
import java.util.ArrayList;
import java.util.List;
import org.jboss.dna.common.text.ParsingException;
@@ -1138,17 +1031,360 @@
return newNode;
}
-
+
+ /**
+ *
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.sequencer.ddl.StandardDdlParser#parseGrantStatement(org.jboss.dna.sequencer.ddl.DdlTokenStream, org.jboss.dna.sequencer.ddl.node.AstNode)
+ */
@Override
protected AstNode parseGrantStatement( DdlTokenStream tokens,
AstNode parentNode ) throws ParsingException {
assert tokens != null;
assert parentNode != null;
+ assert tokens.matches(GRANT);
+
+ markStartOfStatement(tokens);
- return super.parseGrantStatement(tokens, parentNode);
+ // NOTE: This first whack does not account for the potentially repeating name elements after each type
+ // declaration. Example:
+ // GRANT { { SELECT | INSERT | UPDATE | DELETE | TRUNCATE | REFERENCES | TRIGGER }
+ // [,...] | ALL [ PRIVILEGES ] }
+ // ON [ TABLE ] tablename [, ...]
+ // TO { [ GROUP ] rolename | PUBLIC } [, ...] [ WITH GRANT OPTION ]
+ //
+ // the "ON [ TABLE ] tablename [, ...]" seems to indicate that you can grant privileges on multiple tables at once, which
+ // differs from the SQL 92 standard. So this pass ONLY allows one, and a parsing error will probably occur if multiple appear.
+ //
+ // Syntax for tables
+ //
+ // GRANT <privileges> ON <object name>
+ // TO <grantee> [ { <comma> <grantee> }... ]
+ // [ WITH GRANT OPTION ]
+ //
+ // <object name> ::=
+ // [ TABLE ] <table name>
+ // | SEQUENCE <sequence name>
+ // | DATABASE <db name>
+ // | FOREIGN DATA WRAPPER <fdw name>
+ // | FOREIGN SERVER <server name>
+ // | FUNCTION <function name>
+ // | LANGUAGE <language name>
+ // | SCHEMA <schema name>
+ // | TABLESPACE <tablespace name>
+
+ //
+ // Syntax for roles
+ //
+ // GRANT roleName [ {, roleName }* ] TO grantees
+
+ // privilege-types
+ //
+ // ALL PRIVILEGES | privilege-list
+ //
+ List<AstNode> grantNodes = new ArrayList<AstNode>();
+ boolean allPrivileges = false;
+
+ List<AstNode> privileges = new ArrayList<AstNode>();
+
+ tokens.consume("GRANT");
+
+ if( tokens.canConsume("ALL", "PRIVILEGES")) {
+ allPrivileges = true;
+ } else {
+ parseGrantPrivileges(tokens, privileges);
+ }
+
+ if( allPrivileges || !privileges.isEmpty() ) {
+
+ tokens.consume("ON");
+
+ if( tokens.canConsume("SCHEMA")) {
+ grantNodes = parseMultipleGrantTargets(tokens, parentNode, TYPE_GRANT_ON_SCHEMA_STATEMENT);
+ } else if( tokens.canConsume("SEQUENCE") ) {
+ grantNodes = parseMultipleGrantTargets(tokens, parentNode, TYPE_GRANT_ON_SEQUENCE_STATEMENT);
+ } else if( tokens.canConsume("TABLESPACE") ) {
+ grantNodes = parseMultipleGrantTargets(tokens, parentNode, TYPE_GRANT_ON_TABLESPACE_STATEMENT);
+ } else if( tokens.canConsume("DATABASE")) {
+ grantNodes = parseMultipleGrantTargets(tokens, parentNode, TYPE_GRANT_ON_DATABASE_STATEMENT);
+ } else if( tokens.canConsume("FUNCTION")) {
+ grantNodes = parseFunctionAndParameters(tokens, parentNode);
+ } else if( tokens.canConsume("LANGUAGE")) {
+ grantNodes = parseMultipleGrantTargets(tokens, parentNode, TYPE_GRANT_ON_LANGUAGE_STATEMENT);
+ } else if( tokens.canConsume("FOREIGN", "DATA", "WRAPPER")) {
+ grantNodes = parseMultipleGrantTargets(tokens, parentNode, TYPE_GRANT_ON_FOREIGN_DATA_WRAPPER_STATEMENT);
+ } else if( tokens.canConsume("FOREIGN", "SERVER")) {
+ grantNodes = parseMultipleGrantTargets(tokens, parentNode, TYPE_GRANT_ON_FOREIGN_SERVER_STATEMENT);
+ } else {
+ tokens.canConsume(TABLE); // OPTIONAL
+ String name = parseName(tokens);
+ AstNode grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_TABLE_STATEMENT);
+ grantNodes.add(grantNode);
+ while( tokens.canConsume(COMMA) ) {
+ // Assume more names here
+ name = parseName(tokens);
+ grantNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_TABLE_STATEMENT);
+ grantNodes.add(grantNode);
+ }
+ }
+ } else {
+ // Assume ROLES here
+ // role [, ...]
+ AstNode grantNode = nodeFactory().node("roles", parentNode, TYPE_GRANT_ROLES_STATEMENT);
+ grantNodes.add(grantNode);
+ do {
+ String role = parseName(tokens);
+ nodeFactory().node(role, grantNode, ROLE);
+ } while( tokens.canConsume(COMMA));
+ }
+
+ tokens.consume("TO");
+ List<String> grantees = new ArrayList<String>();
+
+ do {
+ String grantee = parseName(tokens);
+ grantees.add(grantee);
+ } while( tokens.canConsume(COMMA));
+
+ boolean withGrantOption = false;
+ if( tokens.canConsume("WITH", "GRANT", "OPTION")) {
+ withGrantOption = true;
+ }
+
+ // Set all properties and children on Grant Nodes
+ for( AstNode grantNode : grantNodes) {
+ List<AstNode> copyOfPrivileges = copyOfPrivileges(privileges);
+ // Attach privileges to grant node
+ for( AstNode node : copyOfPrivileges ) {
+ node.setParent(grantNode);
+ }
+ if( allPrivileges ) {
+ grantNode.setProperty(ALL_PRIVILEGES, allPrivileges);
+ }
+ for( String grantee : grantees) {
+ nodeFactory().node(grantee, grantNode, GRANTEE);
+ }
+
+ if( withGrantOption ) {
+ AstNode optionNode = nodeFactory().node("withGrant", grantNode, TYPE_STATEMENT_OPTION);
+ optionNode.setProperty(VALUE, "WITH GRANT OPTION");
+ }
+ }
+ AstNode firstGrantNode = grantNodes.get(0);
+
+ markEndOfStatement(tokens, firstGrantNode);
+
+ // Update additional grant nodes with statement info
+
+ for( int i=1; i<grantNodes.size(); i++) {
+ AstNode grantNode = grantNodes.get(i);
+ grantNode.setProperty(DDL_EXPRESSION, firstGrantNode.getProperty(DDL_EXPRESSION));
+ grantNode.setProperty(DDL_START_LINE_NUMBER, firstGrantNode.getProperty(DDL_START_LINE_NUMBER));
+ grantNode.setProperty(DDL_START_CHAR_INDEX, firstGrantNode.getProperty(DDL_START_CHAR_INDEX));
+ grantNode.setProperty(DDL_START_COLUMN_NUMBER, firstGrantNode.getProperty(DDL_START_COLUMN_NUMBER));
+ }
+
+
+ return grantNodes.get(0);
}
+
+ /**
+ *
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.sequencer.ddl.StandardDdlParser#parseGrantPrivileges(org.jboss.dna.sequencer.ddl.DdlTokenStream, java.util.List)
+ */
@Override
+ protected void parseGrantPrivileges( DdlTokenStream tokens, List<AstNode> privileges) throws ParsingException {
+ // privilege-types
+ //
+ // ALL PRIVILEGES | privilege-list
+ //
+ // privilege-list
+ //
+ // table-privilege {, table-privilege }*
+ //
+ // table-privilege
+ // SELECT [ <left paren> <privilege column list> <right paren> ]
+ // | DELETE
+ // | INSERT [ <left paren> <privilege column list> <right paren> ]
+ // | UPDATE [ <left paren> <privilege column list> <right paren> ]
+ // | REFERENCES [ <left paren> <privilege column list> <right paren> ]
+ // | USAGE
+ // | TRIGGER
+ // | TRUNCATE
+ // | CREATE
+ // | CONNECT
+ // | TEMPORARY
+ // | TEMP
+ // | EXECUTE
+
+ // POSTGRES has the following Privileges:
+ // GRANT { { SELECT | INSERT | UPDATE | DELETE | TRUNCATE | REFERENCES | TRIGGER }
+
+ do {
+ AstNode node = null;
+
+ if( tokens.canConsume(DELETE)) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, DELETE);
+ } else if( tokens.canConsume(INSERT)) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, INSERT);
+ parseColumnNameList(tokens, node, TYPE_COLUMN_REFERENCE);
+ } else if( tokens.canConsume("REFERENCES")) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, "REFERENCES");
+ parseColumnNameList(tokens, node, TYPE_COLUMN_REFERENCE);
+ } else if( tokens.canConsume(SELECT)) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, SELECT);
+ // Could have columns here
+ // GRANT SELECT (col1), UPDATE (col1) ON mytable TO miriam_rw;
+
+ // Let's just swallow the column data.
+
+ consumeParenBoundedTokens(tokens, true);
+ } else if( tokens.canConsume("USAGE")) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, "USAGE");
+ } else if( tokens.canConsume(UPDATE)) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, UPDATE);
+ parseColumnNameList(tokens, node, TYPE_COLUMN_REFERENCE);
+ } else if( tokens.canConsume("TRIGGER")) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, "TRIGGER");
+ } else if( tokens.canConsume("TRUNCATE")) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, "TRUNCATE");
+ } else if( tokens.canConsume("CREATE")) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, "CREATE");
+ } else if( tokens.canConsume("CONNECT")) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, "CONNECT");
+ } else if( tokens.canConsume("TEMPORARY")) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, "TEMPORARY");
+ } else if( tokens.canConsume("TEMP")) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, "TEMP");
+ } else if( tokens.canConsume("EXECUTE")) {
+ node = nodeFactory().node("privilege");
+ node.setProperty(TYPE, "EXECUTE");
+ }
+
+ if( node == null) {
+ break;
+ }
+ nodeFactory().setType(node, GRANT_PRIVILEGE);
+ privileges.add(node);
+
+ } while( tokens.canConsume(COMMA));
+
+ }
+
+ private List<AstNode> parseMultipleGrantTargets(DdlTokenStream tokens,
+ AstNode parentNode,
+ Name nodeType) throws ParsingException {
+ List<AstNode> grantNodes = new ArrayList<AstNode>();
+ String name = parseName(tokens);
+ AstNode grantNode = nodeFactory().node(name, parentNode, nodeType);
+ grantNodes.add(grantNode);
+ while( tokens.canConsume(COMMA) ) {
+ // Assume more names here
+ name = parseName(tokens);
+ grantNode = nodeFactory().node(name, parentNode, nodeType);
+ grantNodes.add(grantNode);
+ }
+
+ return grantNodes;
+ }
+
+ private List<AstNode> copyOfPrivileges(List<AstNode> privileges) {
+ List<AstNode> copyOfPrivileges = new ArrayList<AstNode>();
+ for( AstNode node : privileges) {
+ copyOfPrivileges.add(node.clone());
+ }
+
+ return copyOfPrivileges;
+ }
+
+ private List<AstNode> parseFunctionAndParameters( DdlTokenStream tokens,
+ AstNode parentNode ) throws ParsingException {
+ boolean isFirstFunction = true;
+ List<AstNode> grantNodes = new ArrayList<AstNode>();
+
+ // FUNCTION funcname ( [ [ argmode ] [ argname ] argtype [, ...] ] ) [, ...]
+
+ // argmode = [ IN, OUT, INOUT, or VARIADIC ]
+
+ // p(a int, b TEXT), q(integer, double)
+
+ // [postgresddl:grantOnFunctionStatement] > ddl:grantStatement, postgresddl:functionOperand mixin
+ // + * (postgresddl:functionParameter) = postgresddl:functionParameter multiple
+
+ do {
+ String name = parseName(tokens);
+ AstNode grantFunctionNode = nodeFactory().node(name, parentNode, TYPE_GRANT_ON_FUNCTION_STATEMENT);
+
+ grantNodes.add(grantFunctionNode);
+
+ // Parse Parameter Data
+ if( tokens.matches(L_PAREN)) {
+ tokens.consume(L_PAREN);
+
+ if( !tokens.canConsume(R_PAREN)) {
+ // check for datatype
+ do{
+ String mode = null;
+
+ if( tokens.matchesAnyOf("IN", "OUT", "INOUT", "VARIADIC")) {
+ mode = tokens.consume();
+ }
+ AstNode paramNode = null;
+
+ DataType dType = getDatatypeParser().parse(tokens);
+ if( dType != null ) {
+ // NO Parameter Name, only DataType
+ paramNode = nodeFactory().node("parameter", grantFunctionNode, FUNCTION_PARAMETER);
+ if( mode != null ) {
+ paramNode.setProperty(FUNCTION_PARAMETER_MODE, mode);
+ }
+ getDatatypeParser().setPropertiesOnNode(paramNode, dType);
+ } else {
+ String paramName = parseName(tokens);
+ dType = getDatatypeParser().parse(tokens);
+ assert paramName != null;
+
+ paramNode = nodeFactory().node(paramName, grantFunctionNode, FUNCTION_PARAMETER);
+ if( mode != null ) {
+ paramNode.setProperty(FUNCTION_PARAMETER_MODE, mode);
+ }
+ if( dType != null ) {
+ getDatatypeParser().setPropertiesOnNode(paramNode, dType);
+ }
+ }
+ } while( tokens.canConsume(COMMA));
+
+ tokens.consume(R_PAREN);
+ }
+ }
+
+ // Reset the first-function flag
+ if( isFirstFunction ) {
+ isFirstFunction = false;
+ }
+ } while( tokens.canConsume(COMMA) );
+
+
+ return grantNodes;
+ }
+
+ @Override
protected AstNode parseSetStatement( DdlTokenStream tokens,
AstNode parentNode ) throws ParsingException {
assert tokens != null;
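The GRANT parsing added above follows one pattern throughout: consume an expected keyword, loop on commas to collect names, then attach grantees and the optional WITH GRANT OPTION. The following is a simplified standalone sketch of that token-consumption approach; `GrantSketch`, its `Grant` holder, and the helper methods are illustrative stand-ins, not DNA's actual `DdlTokenStream`/`AstNode` API:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Standalone illustration of comma-delimited GRANT parsing. This mirrors the
// consume/canConsume loop structure of the parser above, without AstNodes.
public class GrantSketch {

    public static class Grant {
        public final List<String> privileges = new ArrayList<>();
        public final List<String> targets = new ArrayList<>();
        public final List<String> grantees = new ArrayList<>();
        public boolean withGrantOption = false;
    }

    public static Grant parse(String ddl) {
        // Crude whitespace tokenizer; commas become their own tokens.
        List<String> tokens = new ArrayList<>(
            Arrays.asList(ddl.replace(",", " , ").trim().split("\\s+")));
        Grant grant = new Grant();
        expect(tokens, "GRANT");
        // privilege-list: privilege {, privilege}*
        do {
            grant.privileges.add(tokens.remove(0));
        } while (canConsume(tokens, ","));
        expect(tokens, "ON");
        canConsume(tokens, "TABLE"); // TABLE keyword is optional
        // tablename [, ...] -- PostgreSQL allows multiple targets, unlike SQL 92
        do {
            grant.targets.add(tokens.remove(0));
        } while (canConsume(tokens, ","));
        expect(tokens, "TO");
        do {
            grant.grantees.add(tokens.remove(0));
        } while (canConsume(tokens, ","));
        if (canConsume(tokens, "WITH")) {
            expect(tokens, "GRANT");
            expect(tokens, "OPTION");
            grant.withGrantOption = true;
        }
        return grant;
    }

    // Consume the next token or fail, like tokens.consume(word).
    private static void expect(List<String> tokens, String word) {
        if (tokens.isEmpty() || !tokens.remove(0).equalsIgnoreCase(word)) {
            throw new IllegalArgumentException("Expected " + word);
        }
    }

    // Consume the next token only if it matches, like tokens.canConsume(word).
    private static boolean canConsume(List<String> tokens, String word) {
        if (!tokens.isEmpty() && tokens.get(0).equalsIgnoreCase(word)) {
            tokens.remove(0);
            return true;
        }
        return false;
    }
}
```

The real parser additionally clones the privilege nodes for each target (see `copyOfPrivileges`) so that every grant node carries its own copy.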
Modified: trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/DdlSequencerI18n.properties
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/DdlSequencerI18n.properties 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/DdlSequencerI18n.properties 2010-01-05 13:31:47 UTC (rev 1527)
@@ -29,4 +29,5 @@
unusedTokensParsingColumnsAndConstraints = The following unused tokens were found parsing columns and constraints for table: {0}.
unusedTokensParsingColumnDefinition = The following unused tokens were found parsing a column definition for table: {0}.
alterTableOptionNotFound = ALTER TABLE Option not found. Check your DDL for incomplete statement near line {0}, column {1}
-unusedTokensParsingCreateIndex = The following unused tokens were found parsing a create index statement: {0}.
\ No newline at end of file
+unusedTokensParsingCreateIndex = The following unused tokens were found parsing a create index statement: {0}.
+missingReturnTypeForFunction = The function {0} is missing a return data type.
\ No newline at end of file
Deleted: trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/StandardDdl.cnd
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/StandardDdl.cnd 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/StandardDdl.cnd 2010-01-05 13:31:47 UTC (rev 1527)
@@ -1,266 +0,0 @@
-/*
- * JBoss DNA (http://www.jboss.org/dna)
- * See the COPYRIGHT.txt file distributed with this work for information
- * regarding copyright ownership. Some portions may be licensed
- * to Red Hat, Inc. under one or more contributor license agreements.
- * See the AUTHORS.txt file in the distribution for a full listing of
- * individual contributors.
- *
- * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
- * is licensed to you under the terms of the GNU Lesser General Public License as
- * published by the Free Software Foundation; either version 2.1 of
- * the License, or (at your option) any later version.
- *
- * JBoss DNA is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with this software; if not, write to the Free
- * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
- * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
- */
-
- //------------------------------------------------------------------------------
-// N A M E S P A C E S
-//------------------------------------------------------------------------------
-<jcr='http://www.jcp.org/jcr/1.0'>
-<nt='http://www.jcp.org/jcr/nt/1.0'>
-<mix='http://www.jcp.org/jcr/mix/1.0'>
-<ddl='http://www.jboss.org/dna/ddl/1.0'>
-
-
-//------------------------------------------------------------------------------
-// N O D E T Y P E S
-//------------------------------------------------------------------------------
-
-[ddl:operation] mixin abstract
-[ddl:operand] mixin abstract
- - ddl:name (STRING) mandatory
-
-// =============================================================================
-// STATEMENT
-// =============================================================================
-[ddl:statement] mixin abstract
- - ddl:expression (string) mandatory // The string fragment encompassing the statement expression.
- - ddl:originalExpression (string) mandatory // The string fragment encompassing the original statement expression.
- - ddl:startLineNumber (long) mandatory // The starting line number for the statement
- - ddl:startColumnNumber (long) mandatory // The starting column number for the statement
- - ddl:startCharIndex (long) mandatory // The starting content character index for the statement
- - ddl:length (long) mandatory // The string length
- + ddl:problem (ddl:ddlProblem) = ddl:ddlProblem multiple // Problems encountered during parsing.
-
-// =============================================================================
-// CREATE, ALTER, DROP, INSERT, SET, GRANT, REVOKE
-// =============================================================================
-[ddl:creatable] > ddl:operation abstract
-[ddl:alterable] > ddl:operation abstract
-[ddl:droppable] > ddl:operation abstract
- - ddl:dropBehavior (STRING)
- + ddl:dropOption (ddl:statementOption) = ddl:statementOption multiple
-[ddl:insertable] > ddl:operation abstract
-[ddl:settable] > ddl:operation abstract
-[ddl:grantable] > ddl:operation abstract
-[ddl:revokable] > ddl:operation abstract
-[ddl:renamable] > ddl:operation, ddl:operand abstract
- - ddl:newName (STRING)
-
-// =============================================================================
-// OPERANDS: SCHEMA, TABLE, DOMAIN, VIEW, ASSERTION, CHARACTER SET, COLLATION, TRANSLATION
-// =============================================================================
-[ddl:schemaOperand] > ddl:operand abstract
-[ddl:tableOperand] > ddl:operand abstract
-[ddl:domainOperand] > ddl:operand abstract
-[ddl:viewOperand] > ddl:operand abstract
-[ddl:assertionOperand] > ddl:operand abstract
-[ddl:characterSetOperand] > ddl:operand abstract
-[ddl:collationOperand] > ddl:operand abstract
-[ddl:translationOperand] > ddl:operand abstract
-[ddl:columnOperand] > ddl:operand abstract
-[ddl:tableConstraintOperand] > ddl:operand abstract
-[ddl:referenceOperand] > ddl:operand abstract
-
-// =============================================================================
-// COLUMN
-// =============================================================================
-[ddl:columnDefinition] > ddl:creatable, ddl:columnOperand mixin
- - ddl:datatypeName (STRING) mandatory
- - ddl:datatypeLength (LONG)
- - ddl:datatypePrecision (LONG)
- - ddl:datatypeScale (LONG)
- - ddl:nullable (STRING)
- - ddl:defaultOption (STRING)
- < 'LITERAL', 'DATETIME', 'USER', 'CURRENT_USER', 'SESSION_USER', 'SYSTEM_USER', 'NULL'
- - ddl:defaultValue (STRING)
- - ddl:defaultPrecision (LONG)
- - ddl:collationName (STRING)
- + ddl:dropBehavior (ddl:simpleProperty) = ddl:simpleProperty
- + ddl:columnAttribute (ddl:simpleProperty) = ddl:simpleProperty multiple
-
-// =============================================================================
-// TABLE CONSTRAINT
-// =============================================================================
-[ddl:tableConstraintDefinition] > ddl:creatable, ddl:tableConstraintOperand mixin
- - ddl:constraintType (STRING) mandatory
- < 'UNIQUE', 'PRIMARY KEY', 'FOREIGN KEY', 'CHECK'
- - ddl:deferrable (STRING)
- < 'DEFERRABLE', 'NOT DEFERRABLE'
- - ddl:checkSearchCondition (STRING)
- < 'INITIALLY DEFERRED', 'INITIALLY IMMEDIATE'
- + * (ddl:columnReference) = ddl:columnReference multiple
- + * (ddl:tableReference) = ddl:tableReference
- + * (ddl:fkColumnReference) = ddl:fkColumnReference multiple
- + ddl:constraintAttribute (ddl:simpleProperty) = ddl:simpleProperty multiple
-
-// =============================================================================
-// REFERENCE
-// =============================================================================
-[ddl:columnReference] > ddl:referenceOperand mixin
-[ddl:tableReference] > ddl:referenceOperand mixin
-[ddl:fkColumnReference] > ddl:referenceOperand mixin
-
-// =============================================================================
-// SIMPLE STRING PROPERTY
-// =============================================================================
-[ddl:simpleProperty] mixin
- - ddl:propValue (STRING) mandatory
-
-// =============================================================================
-// STATEMENT OPTION
-// =============================================================================
-[ddl:statementOption] mixin
- - ddl:value (STRING) mandatory
-
-// =============================================================================
-// DDL PROBLEM
-// =============================================================================
-[ddl:ddlProblem] mixin
- - ddl:problemLevel (LONG) mandatory
- - ddl:message (STRING) mandatory
-
-// =============================================================================
-// CREATE SCHEMA
-// =============================================================================
-[ddl:schemaDefinition] > ddl:statement, ddl:creatable, ddl:schemaOperand mixin
- - ddl:defaultCharacterSetName (STRING)
- + * (ddl:statement) = ddl:statement multiple
-
-// =============================================================================
-// CREATE TABLE
-// =============================================================================
-[ddl:createTableStatement] > ddl:statement, ddl:creatable, ddl:tableOperand mixin
- - ddl:temporary (STRING)
- < 'GLOBAL', 'LOCAL'
- - ddl:onCommitValue (STRING)
- < 'DELETE ROWS', 'PRESERVE ROWS'
- + * (ddl:columnDefinition) = ddl:columnDefinition multiple
- + * (ddl:tableConstraintDefinition) = ddl:tableConstraintDefinition multiple
- + * (ddl:statementOption) = ddl:statementOption multiple
-
-// =============================================================================
-// CREATE VIEW
-// =============================================================================
-[ddl:createViewStatement] > ddl:statement, ddl:creatable, ddl:viewOperand mixin
- - ddl:sqlExpression (STRING) mandatory
- - ddl:checkOption (STRING)
- + * (ddl:columnReference) = ddl:columnReference multiple
-
-// =============================================================================
-// CREATE DOMAIN
-// =============================================================================
-[ddl:createDomainStatement] > ddl:statement, ddl:creatable, ddl:domainOperand mixin
- - ddl:datatypeName (STRING) mandatory
- - ddl:datatypeLength (LONG)
- - ddl:datatypePrecision (LONG)
- - ddl:datatypeScale (LONG)
- - ddl:nullable (STRING)
- - ddl:defaultOption (STRING)
- < 'LITERAL', 'DATETIME', 'USER', 'CURRENT_USER', 'SESSION_USER', 'SYSTEM_USER', 'NULL'
- - ddl:defaultValue (STRING)
- - ddl:defaultPrecision (LONG)
- - ddl:collationName (STRING)
- + ddl:domainConstraintDefinition (ddl:tableConstraintDefinition) = ddl:tableConstraintDefinition multiple
-
-// =============================================================================
-// CREATE ASSERTION
-// =============================================================================
-[ddl:createAssertionStatement] > ddl:statement, ddl:creatable, ddl:assertionOperand mixin
- - ddl:constraintName (STRING) mandatory
- - ddl:searchCondition (STRING) mandatory
- + ddl:constraintAttribute (ddl:simpleProperty) = ddl:simpleProperty multiple
-
-// =============================================================================
-// CREATE CHARACTER SET
-// =============================================================================
-[ddl:createCharacterSetStatement] > ddl:statement, ddl:creatable, ddl:characterSetOperand mixin
- - ddl:existingName (STRING) mandatory
- - ddl:collateClause (STRING) // TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
- - ddl:limitedCollationDefinition (STRING) // TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
-
-// =============================================================================
-// CREATE COLLATION
-// =============================================================================
-[ddl:createCollationStatement] > ddl:statement, ddl:creatable, ddl:collationOperand mixin
- - ddl:characterSetName (STRING) mandatory
- - ddl:collationSource (STRING) // TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
- - ddl:padAttribute (STRING)
- < 'NO PAD', 'PAD SPACE'
-
-// =============================================================================
-// CREATE TRANSLATION
-// =============================================================================
-[ddl:createTranslationStatement] > ddl:statement, ddl:creatable, ddl:translationOperand mixin
- - ddl:sourceCharacterSetName (STRING) mandatory
- - ddl:targetCharacterSetName (STRING) mandatory
- - ddl:translationSource (STRING) // TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
-
-// =============================================================================
-// ALTER TABLE
-// =============================================================================
-[ddl:alterTableStatement] > ddl:statement, ddl:alterable, ddl:tableOperand mixin
- + * (ddl:addColumnDefinition) = ddl:addColumnDefinition multiple
- + * (ddl:dropColumnDefinition) = ddl:dropColumnDefinition multiple
- + * (ddl:alterColumnDefinition) = ddl:alterColumnDefinition multiple
- + * (ddl:addTableConstraintDefinition) = ddl:addTableConstraintDefinition multiple
- + * (ddl:dropTableConstraintDefinition) = ddl:dropTableConstraintDefinition multiple
- + * (ddl:statementOption) = ddl:statementOption multiple
-
-// =============================================================================
-// ALTER DOMAIN
-// =============================================================================
-[ddl:alterDomainStatement] > ddl:statement, ddl:alterable, ddl:domainOperand mixin
- - ddl:alterDomainAction (STRING) // TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
-
-// =============================================================================
-// DROP STATEMENTS
-// =============================================================================
-[ddl:dropSchemaStatement] > ddl:statement, ddl:droppable, ddl:schemaOperand mixin
-[ddl:dropTableStatement] > ddl:statement, ddl:droppable, ddl:tableOperand mixin
-[ddl:dropViewStatement] > ddl:statement, ddl:droppable, ddl:viewOperand mixin
-[ddl:dropDomainStatement] > ddl:statement, ddl:droppable, ddl:domainOperand mixin
-[ddl:dropCharacterSetStatement] > ddl:statement, ddl:droppable, ddl:characterSetOperand mixin
-[ddl:dropCollationStatement] > ddl:statement, ddl:droppable, ddl:collationOperand mixin
-[ddl:dropTranslationStatement] > ddl:statement, ddl:droppable, ddl:translationOperand mixin
-[ddl:dropAssertionStatement] > ddl:statement, ddl:droppable, ddl:assertionOperand mixin
-
-[ddl:alterColumnDefinition] > ddl:columnDefinition, ddl:alterable mixin
-[ddl:addColumnDefinition] > ddl:columnDefinition, ddl:creatable mixin
-[ddl:dropColumnDefinition] > ddl:columnDefinition, ddl:droppable mixin
-[ddl:addTableConstraintDefinition] > ddl:tableConstraintDefinition, ddl:creatable mixin
-[ddl:dropTableConstraintDefinition] > ddl:tableConstraintDefinition, ddl:droppable mixin
-
-// =============================================================================
-// MISC STATEMENTS
-// =============================================================================
-[ddl:setStatement] > ddl:statement, ddl:settable mixin
- // TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
-
-[ddl:insertStatement] > ddl:statement, ddl:insertable mixin
- // TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
-
-[ddl:grantStatement] > ddl:statement, ddl:grantable mixin
- // TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
-
-
-
Added: trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/StandardDdl.cnd
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/StandardDdl.cnd (rev 0)
+++ trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/StandardDdl.cnd 2010-01-05 13:31:47 UTC (rev 1527)
@@ -0,0 +1,280 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+
+ //------------------------------------------------------------------------------
+// N A M E S P A C E S
+//------------------------------------------------------------------------------
+<jcr='http://www.jcp.org/jcr/1.0'>
+<nt='http://www.jcp.org/jcr/nt/1.0'>
+<mix='http://www.jcp.org/jcr/mix/1.0'>
+<ddl='http://www.jboss.org/dna/ddl/1.0'>
+
+
+//------------------------------------------------------------------------------
+// N O D E T Y P E S
+//------------------------------------------------------------------------------
+
+[ddl:operation] mixin abstract
+[ddl:operand] mixin abstract
+ - ddl:name (STRING) mandatory
+
+// =============================================================================
+// STATEMENT
+// =============================================================================
+[ddl:statement] mixin abstract
+ - ddl:expression (string) mandatory // The string fragment encompassing the statement expression.
+ - ddl:originalExpression (string) mandatory // The string fragment encompassing the original statement expression.
+ - ddl:startLineNumber (long) mandatory // The starting line number for the statement
+ - ddl:startColumnNumber (long) mandatory // The starting column number for the statement
+ - ddl:startCharIndex (long) mandatory // The starting content character index for the statement
+ - ddl:length (long) mandatory // The string length
+ + ddl:problem (ddl:ddlProblem) = ddl:ddlProblem multiple // Problems encountered during parsing.
+
+// =============================================================================
+// CREATE, ALTER, DROP, INSERT, SET, GRANT, REVOKE
+// =============================================================================
+[ddl:creatable] > ddl:operation abstract
+[ddl:alterable] > ddl:operation abstract
+[ddl:droppable] > ddl:operation abstract
+ - ddl:dropBehavior (STRING)
+ + ddl:dropOption (ddl:statementOption) = ddl:statementOption multiple
+[ddl:insertable] > ddl:operation abstract
+[ddl:settable] > ddl:operation abstract
+[ddl:grantable] > ddl:operation abstract
+[ddl:revokable] > ddl:operation abstract
+[ddl:renamable] > ddl:operation abstract
+ - ddl:newName (STRING)
+
+// =============================================================================
+// OPERANDS: SCHEMA, TABLE, DOMAIN, VIEW, ASSERTION, CHARACTER SET, COLLATION, TRANSLATION
+// =============================================================================
+[ddl:schemaOperand] > ddl:operand abstract
+[ddl:tableOperand] > ddl:operand abstract
+[ddl:domainOperand] > ddl:operand abstract
+[ddl:viewOperand] > ddl:operand abstract
+[ddl:assertionOperand] > ddl:operand abstract
+[ddl:characterSetOperand] > ddl:operand abstract
+[ddl:collationOperand] > ddl:operand abstract
+[ddl:translationOperand] > ddl:operand abstract
+[ddl:columnOperand] > ddl:operand abstract
+[ddl:tableConstraintOperand] > ddl:operand abstract
+[ddl:referenceOperand] > ddl:operand abstract
+
+// =============================================================================
+// COLUMN
+// =============================================================================
+[ddl:columnDefinition] > ddl:creatable, ddl:columnOperand mixin
+ - ddl:datatypeName (STRING) mandatory
+ - ddl:datatypeLength (LONG)
+ - ddl:datatypePrecision (LONG)
+ - ddl:datatypeScale (LONG)
+ - ddl:nullable (STRING)
+ - ddl:defaultOption (STRING)
+ < 'LITERAL', 'DATETIME', 'USER', 'CURRENT_USER', 'SESSION_USER', 'SYSTEM_USER', 'NULL'
+ - ddl:defaultValue (STRING)
+ - ddl:defaultPrecision (LONG)
+ - ddl:collationName (STRING)
+ + ddl:dropBehavior (ddl:simpleProperty) = ddl:simpleProperty
+ + ddl:columnAttribute (ddl:simpleProperty) = ddl:simpleProperty multiple
+
+// =============================================================================
+// TABLE CONSTRAINT
+// =============================================================================
+[ddl:tableConstraintDefinition] > ddl:creatable, ddl:tableConstraintOperand mixin
+ - ddl:constraintType (STRING) mandatory
+ < 'UNIQUE', 'PRIMARY KEY', 'FOREIGN KEY', 'CHECK'
+ - ddl:deferrable (STRING)
+ < 'DEFERRABLE', 'NOT DEFERRABLE'
+ - ddl:checkSearchCondition (STRING)
+ < 'INITIALLY DEFERRED', 'INITIALLY IMMEDIATE'
+ + * (ddl:columnReference) = ddl:columnReference multiple
+ + * (ddl:tableReference) = ddl:tableReference
+ + * (ddl:fkColumnReference) = ddl:fkColumnReference multiple
+ + ddl:constraintAttribute (ddl:simpleProperty) = ddl:simpleProperty multiple
+
+// =============================================================================
+// REFERENCE
+// =============================================================================
+[ddl:columnReference] > ddl:referenceOperand mixin
+[ddl:tableReference] > ddl:referenceOperand mixin
+[ddl:fkColumnReference] > ddl:referenceOperand mixin
+[ddl:grantee] > ddl:referenceOperand mixin
+
+// =============================================================================
+// SIMPLE STRING PROPERTY
+// =============================================================================
+[ddl:simpleProperty] mixin
+ - ddl:propValue (STRING) mandatory
+
+// =============================================================================
+// STATEMENT OPTION
+// =============================================================================
+[ddl:statementOption] mixin
+ - ddl:value (STRING) mandatory
+
+// =============================================================================
+// DDL PROBLEM
+// =============================================================================
+[ddl:ddlProblem] mixin
+ - ddl:problemLevel (LONG) mandatory
+ - ddl:message (STRING) mandatory
+
+// =============================================================================
+// CREATE SCHEMA
+// =============================================================================
+[ddl:schemaDefinition] > ddl:statement, ddl:creatable, ddl:schemaOperand mixin
+ - ddl:defaultCharacterSetName (STRING)
+ + * (ddl:statement) = ddl:statement multiple
+
+// =============================================================================
+// CREATE TABLE
+// =============================================================================
+[ddl:createTableStatement] > ddl:statement, ddl:creatable, ddl:tableOperand mixin
+ - ddl:temporary (STRING)
+ < 'GLOBAL', 'LOCAL'
+ - ddl:onCommitValue (STRING)
+ < 'DELETE ROWS', 'PRESERVE ROWS'
+ + * (ddl:columnDefinition) = ddl:columnDefinition multiple
+ + * (ddl:tableConstraintDefinition) = ddl:tableConstraintDefinition multiple
+ + * (ddl:statementOption) = ddl:statementOption multiple
+
+// =============================================================================
+// CREATE VIEW
+// =============================================================================
+[ddl:createViewStatement] > ddl:statement, ddl:creatable, ddl:viewOperand mixin
+ - ddl:sqlExpression (STRING) mandatory
+ - ddl:checkOption (STRING)
+ + * (ddl:columnReference) = ddl:columnReference multiple
+
+// =============================================================================
+// CREATE DOMAIN
+// =============================================================================
+[ddl:createDomainStatement] > ddl:statement, ddl:creatable, ddl:domainOperand mixin
+ - ddl:datatypeName (STRING) mandatory
+ - ddl:datatypeLength (LONG)
+ - ddl:datatypePrecision (LONG)
+ - ddl:datatypeScale (LONG)
+ - ddl:nullable (STRING)
+ - ddl:defaultOption (STRING)
+ < 'LITERAL', 'DATETIME', 'USER', 'CURRENT_USER', 'SESSION_USER', 'SYSTEM_USER', 'NULL'
+ - ddl:defaultValue (STRING)
+ - ddl:defaultPrecision (LONG)
+ - ddl:collationName (STRING)
+ + ddl:domainConstraintDefinition (ddl:tableConstraintDefinition) = ddl:tableConstraintDefinition multiple
+
+// =============================================================================
+// CREATE ASSERTION
+// =============================================================================
+[ddl:createAssertionStatement] > ddl:statement, ddl:creatable, ddl:assertionOperand mixin
+ - ddl:constraintName (STRING) mandatory
+ - ddl:searchCondition (STRING) mandatory
+ + ddl:constraintAttribute (ddl:simpleProperty) = ddl:simpleProperty multiple
+
+// =============================================================================
+// CREATE CHARACTER SET
+// =============================================================================
+[ddl:createCharacterSetStatement] > ddl:statement, ddl:creatable, ddl:characterSetOperand mixin
+ - ddl:existingName (STRING) mandatory
+ - ddl:collateClause (STRING) // TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
+ - ddl:limitedCollationDefinition (STRING) // TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
+
+// =============================================================================
+// CREATE COLLATION
+// =============================================================================
+[ddl:createCollationStatement] > ddl:statement, ddl:creatable, ddl:collationOperand mixin
+ - ddl:characterSetName (STRING) mandatory
+ - ddl:collationSource (STRING) // TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
+ - ddl:padAttribute (STRING)
+ < 'NO PAD', 'PAD SPACE'
+
+// =============================================================================
+// CREATE TRANSLATION
+// =============================================================================
+[ddl:createTranslationStatement] > ddl:statement, ddl:creatable, ddl:translationOperand mixin
+ - ddl:sourceCharacterSetName (STRING) mandatory
+ - ddl:targetCharacterSetName (STRING) mandatory
+ - ddl:translationSource (STRING) // TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
+
+// =============================================================================
+// ALTER TABLE
+// =============================================================================
+[ddl:alterTableStatement] > ddl:statement, ddl:alterable, ddl:tableOperand mixin
+ + * (ddl:addColumnDefinition) = ddl:addColumnDefinition multiple
+ + * (ddl:dropColumnDefinition) = ddl:dropColumnDefinition multiple
+ + * (ddl:alterColumnDefinition) = ddl:alterColumnDefinition multiple
+ + * (ddl:addTableConstraintDefinition) = ddl:addTableConstraintDefinition multiple
+ + * (ddl:dropTableConstraintDefinition) = ddl:dropTableConstraintDefinition multiple
+ + * (ddl:statementOption) = ddl:statementOption multiple
+
+// =============================================================================
+// ALTER DOMAIN
+// =============================================================================
+[ddl:alterDomainStatement] > ddl:statement, ddl:alterable, ddl:domainOperand mixin
+ - ddl:alterDomainAction (STRING) // TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
+
+// =============================================================================
+// DROP STATEMENTS
+// =============================================================================
+[ddl:dropSchemaStatement] > ddl:statement, ddl:droppable, ddl:schemaOperand mixin
+[ddl:dropTableStatement] > ddl:statement, ddl:droppable, ddl:tableOperand mixin
+[ddl:dropViewStatement] > ddl:statement, ddl:droppable, ddl:viewOperand mixin
+[ddl:dropDomainStatement] > ddl:statement, ddl:droppable, ddl:domainOperand mixin
+[ddl:dropCharacterSetStatement] > ddl:statement, ddl:droppable, ddl:characterSetOperand mixin
+[ddl:dropCollationStatement] > ddl:statement, ddl:droppable, ddl:collationOperand mixin
+[ddl:dropTranslationStatement] > ddl:statement, ddl:droppable, ddl:translationOperand mixin
+[ddl:dropAssertionStatement] > ddl:statement, ddl:droppable, ddl:assertionOperand mixin
+
+[ddl:alterColumnDefinition] > ddl:columnDefinition, ddl:alterable mixin
+[ddl:addColumnDefinition] > ddl:columnDefinition, ddl:creatable mixin
+[ddl:dropColumnDefinition] > ddl:columnDefinition, ddl:droppable mixin
+[ddl:addTableConstraintDefinition] > ddl:tableConstraintDefinition, ddl:creatable mixin
+[ddl:dropTableConstraintDefinition] > ddl:tableConstraintDefinition, ddl:droppable mixin
+
+// =============================================================================
+// MISC STATEMENTS
+// =============================================================================
+[ddl:setStatement] > ddl:statement, ddl:settable mixin
+ // TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
+
+[ddl:insertStatement] > ddl:statement, ddl:insertable mixin
+ // TODO: THIS IS COMPLEX, NEED TO BREAK DOWN
+
+// =============================================================================
+// GRANT STATEMENTS
+// =============================================================================
+
+[ddl:grantPrivilege] mixin
+ - ddl:type (STRING) mandatory
+ + * (ddl:columnReference) = ddl:columnReference multiple
+
+[ddl:grantStatement] > ddl:statement, ddl:grantable mixin
+ - ddl:allPrivileges (boolean)
+ + * (ddl:grantPrivilege) = ddl:grantPrivilege multiple
+ + * (ddl:grantee) = ddl:grantee multiple
+
+[ddl:grantOnTableStatement] > ddl:grantStatement, ddl:tableOperand mixin
+[ddl:grantOnDomainStatement] > ddl:grantStatement, ddl:domainOperand mixin
+[ddl:grantOnCollationStatement] > ddl:grantStatement, ddl:collationOperand mixin
+[ddl:grantOnCharacterSetStatement] > ddl:grantStatement, ddl:characterSetOperand mixin
+[ddl:grantOnTranslationStatement] > ddl:grantStatement, ddl:translationOperand mixin
Property changes on: trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/StandardDdl.cnd
___________________________________________________________________
Name: svn:executable
+ *
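[Editor's note, not part of the commit: the StandardDdl.cnd hunk above declares value constraints with `< 'A', 'B'` lines, e.g. `ddl:temporary` limited to 'GLOBAL'/'LOCAL'. The sketch below illustrates what such a constraint means for a sequenced node's property values. The constraint table is transcribed from the CND text; the `check_constraint` helper is hypothetical and is not DNA/ModeShape API.]

```python
# Allowed values per constrained property, transcribed from the
# StandardDdl.cnd value-constraint lines ("< 'A', 'B'") above.
VALUE_CONSTRAINTS = {
    "ddl:temporary": {"GLOBAL", "LOCAL"},
    "ddl:onCommitValue": {"DELETE ROWS", "PRESERVE ROWS"},
    "ddl:constraintType": {"UNIQUE", "PRIMARY KEY", "FOREIGN KEY", "CHECK"},
    "ddl:deferrable": {"DEFERRABLE", "NOT DEFERRABLE"},
}

def check_constraint(prop: str, value: str) -> bool:
    """Return True if `value` satisfies the CND value constraint on
    `prop`; properties without a constraint accept any value."""
    allowed = VALUE_CONSTRAINTS.get(prop)
    return allowed is None or value in allowed

# A conforming CREATE TABLE statement node passes:
assert check_constraint("ddl:temporary", "GLOBAL")
# A value outside the constraint list would be rejected on save:
assert not check_constraint("ddl:constraintType", "EXCLUSION")
```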
Deleted: trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdl.cnd
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdl.cnd 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdl.cnd 2010-01-05 13:31:47 UTC (rev 1527)
@@ -1,72 +0,0 @@
-/*
- * JBoss DNA (http://www.jboss.org/dna)
- * See the COPYRIGHT.txt file distributed with this work for information
- * regarding copyright ownership. Some portions may be licensed
- * to Red Hat, Inc. under one or more contributor license agreements.
- * See the AUTHORS.txt file in the distribution for a full listing of
- * individual contributors.
- *
- * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
- * is licensed to you under the terms of the GNU Lesser General Public License as
- * published by the Free Software Foundation; either version 2.1 of
- * the License, or (at your option) any later version.
- *
- * JBoss DNA is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with this software; if not, write to the Free
- * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
- * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
- */
-
- //------------------------------------------------------------------------------
-// N A M E S P A C E S
-//------------------------------------------------------------------------------
-<jcr='http://www.jcp.org/jcr/1.0'>
-<nt='http://www.jcp.org/jcr/nt/1.0'>
-<mix='http://www.jcp.org/jcr/mix/1.0'>
-<ddl='http://www.jboss.org/dna/ddl/1.0'>
-<derbyddl='http://www.jboss.org/dna/ddl/derby/1.0'>
-
-// =============================================================================
-// OPERANDS
-// =============================================================================
-[derbyddl:functionOperand] > ddl:operand abstract
-[derbyddl:indexOperand] > ddl:operand abstract
-[derbyddl:procedureOperand] > ddl:operand abstract
-[derbyddl:roleOperand] > ddl:operand abstract
-[derbyddl:synonymOperand] > ddl:operand abstract
-[derbyddl:triggerOperand] > ddl:operand abstract
-
-// =============================================================================
-// COLUMN
-// =============================================================================
-[derbyddl:columnDefinition] > ddl:columnDefinition mixin
- - derbyddl:dropDefault (boolean)
-
-// =============================================================================
-// CREATE STATEMENTS
-// =============================================================================
-[derbyddl:createFunctionStatement] > ddl:creatable, ddl:statement, derbyddl:functionOperand mixin
-[derbyddl:createIndex] > ddl:statement, ddl:creatable, derbyddl:indexOperand mixin
- - derbyddl:tableName (string) mandatory
- - derbyddl:unique (boolean)
- + * (ddl:columnReference) = ddl:columnReference multiple
-[derbyddl:createProcedureStatement] > ddl:creatable, ddl:statement, derbyddl:procedureOperand mixin
-[derbyddl:createRoleStatement] > ddl:creatable, ddl:statement, derbyddl:roleOperand mixin
-[derbyddl:createSynonymStatement] > ddl:creatable, ddl:statement, derbyddl:synonymOperand mixin
-[derbyddl:createTriggerStatement] > ddl:creatable, ddl:statement, derbyddl:triggerOperand mixin
-
-
-// =============================================================================
-// DROP STATEMENTS
-// =============================================================================
-[derbyddl:dropFunctionStatement] > ddl:droppable, derbyddl:functionOperand mixin
-[derbyddl:dropIndexStatement] > ddl:droppable, derbyddl:indexOperand mixin
-[derbyddl:dropProcedureStatement] > ddl:droppable, derbyddl:procedureOperand mixin
-[derbyddl:dropRoleStatement] > ddl:droppable, derbyddl:roleOperand mixin
-[derbyddl:dropSynonymStatement] > ddl:droppable, derbyddl:synonymOperand mixin
-[derbyddl:dropTriggerStatement] > ddl:droppable, derbyddl:triggerOperand mixin
Added: trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdl.cnd
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdl.cnd (rev 0)
+++ trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdl.cnd 2010-01-05 13:31:47 UTC (rev 1527)
@@ -0,0 +1,102 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+
+ //------------------------------------------------------------------------------
+// N A M E S P A C E S
+//------------------------------------------------------------------------------
+<jcr='http://www.jcp.org/jcr/1.0'>
+<nt='http://www.jcp.org/jcr/nt/1.0'>
+<mix='http://www.jcp.org/jcr/mix/1.0'>
+<ddl='http://www.jboss.org/dna/ddl/1.0'>
+<derbyddl='http://www.jboss.org/dna/ddl/derby/1.0'>
+
+// =============================================================================
+// OPERANDS
+// =============================================================================
+[derbyddl:functionOperand] > ddl:operand abstract
+[derbyddl:indexOperand] > ddl:operand abstract
+[derbyddl:procedureOperand] > ddl:operand abstract
+[derbyddl:roleOperand] > ddl:operand abstract
+[derbyddl:synonymOperand] > ddl:operand abstract
+[derbyddl:triggerOperand] > ddl:operand abstract
+
+[derbyddl:roleName] > derbyddl:roleOperand mixin
+
+// =============================================================================
+// COLUMN
+// =============================================================================
+[derbyddl:columnDefinition] > ddl:columnDefinition mixin
+ - derbyddl:dropDefault (boolean)
+
+[derbyddl:functionParameter] > ddl:columnDefinition mixin
+
+[derbyddl:indexColumnReference] > ddl:columnReference mixin
+ - derbyddl:order (STRING)
+
+// =============================================================================
+// CREATE STATEMENTS
+// =============================================================================
+[derbyddl:createFunctionStatement] > ddl:creatable, ddl:statement, derbyddl:functionOperand mixin
+ - ddl:datatypeName (STRING) mandatory
+ - ddl:datatypeLength (LONG)
+ - ddl:datatypePrecision (LONG)
+ - ddl:datatypeScale (LONG)
+ - ddl:isTableType (boolean)
+ + * (derbyddl:functionParameter) = derbyddl:functionParameter multiple
+ + * (ddl:statementOption) = ddl:statementOption multiple
+[derbyddl:createIndexStatement] > ddl:statement, ddl:creatable, derbyddl:indexOperand mixin
+ - derbyddl:tableName (string) mandatory
+ - derbyddl:unique (boolean)
+ + * (derbyddl:indexColumnReference) = derbyddl:indexColumnReference multiple
+[derbyddl:createProcedureStatement] > ddl:creatable, ddl:statement, derbyddl:procedureOperand mixin
+[derbyddl:createRoleStatement] > ddl:creatable, ddl:statement, derbyddl:roleOperand mixin
+[derbyddl:createSynonymStatement] > ddl:creatable, ddl:statement, derbyddl:synonymOperand mixin
+ - derbyddl:tableName (string) mandatory
+[derbyddl:createTriggerStatement] > ddl:creatable, ddl:statement, derbyddl:triggerOperand mixin
+ - derbyddl:tableName (string) mandatory
+ - ddl:sql (string) mandatory
+ + * (ddl:columnReference) = ddl:columnReference multiple
+[derbyddl:declareGlobalTemporaryTableStatement] > ddl:createTableStatement mixin
+
+// =============================================================================
+// DROP STATEMENTS
+// =============================================================================
+[derbyddl:dropFunctionStatement] > ddl:droppable, derbyddl:functionOperand mixin
+[derbyddl:dropIndexStatement] > ddl:droppable, derbyddl:indexOperand mixin
+[derbyddl:dropProcedureStatement] > ddl:droppable, derbyddl:procedureOperand mixin
+[derbyddl:dropRoleStatement] > ddl:droppable, derbyddl:roleOperand mixin
+[derbyddl:dropSynonymStatement] > ddl:droppable, derbyddl:synonymOperand mixin
+[derbyddl:dropTriggerStatement] > ddl:droppable, derbyddl:triggerOperand mixin
+
+// =============================================================================
+// MISC STATEMENTS
+// =============================================================================
+[derbyddl:lockTableStatement] > ddl:statement, ddl:tableOperand mixin
+[derbyddl:renameTableStatement] > ddl:statement, ddl:renamable, ddl:tableOperand mixin
+
+[derbyddl:grantOnFunctionStatement] > ddl:grantStatement, derbyddl:functionOperand mixin
+[derbyddl:grantOnProcedureStatement] > ddl:grantStatement, derbyddl:procedureOperand mixin
+
+[derbyddl:grantRolesStatement] > ddl:grantStatement mixin
+ + ddl:name (derbyddl:roleName) = derbyddl:roleName multiple
\ No newline at end of file
Property changes on: trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdl.cnd
___________________________________________________________________
Name: svn:executable
+ *
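[Editor's note, not part of the commit: several Derby statement types above mark properties `mandatory`, e.g. `derbyddl:createIndexStatement` requires `derbyddl:tableName`. The sketch below shows the kind of check a repository performs when such a node is saved. The `MANDATORY` table is transcribed from the DerbyDdl.cnd hunk; `missing_mandatory` is a hypothetical helper, not DNA/ModeShape API.]

```python
# Mandatory property names per node type, transcribed from the
# DerbyDdl.cnd definitions in the added hunk above.
MANDATORY = {
    "derbyddl:createIndexStatement": ["derbyddl:tableName"],
    "derbyddl:createSynonymStatement": ["derbyddl:tableName"],
    "derbyddl:createTriggerStatement": ["derbyddl:tableName", "ddl:sql"],
}

def missing_mandatory(node_type: str, props: dict) -> list:
    """Return the mandatory property names of `node_type` that are
    absent from the candidate property map `props`."""
    return [p for p in MANDATORY.get(node_type, []) if p not in props]

# A well-formed CREATE INDEX node (property values are illustrative):
index_node = {"derbyddl:tableName": "EMPLOYEE", "derbyddl:unique": True}
assert missing_mandatory("derbyddl:createIndexStatement", index_node) == []
# An empty trigger node is missing both of its mandatory properties:
assert missing_mandatory("derbyddl:createTriggerStatement", {}) == [
    "derbyddl:tableName", "ddl:sql"]
```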
Deleted: trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdl.cnd
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdl.cnd 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdl.cnd 2010-01-05 13:31:47 UTC (rev 1527)
@@ -1,178 +0,0 @@
-/*
- * JBoss DNA (http://www.jboss.org/dna)
- * See the COPYRIGHT.txt file distributed with this work for information
- * regarding copyright ownership. Some portions may be licensed
- * to Red Hat, Inc. under one or more contributor license agreements.
- * See the AUTHORS.txt file in the distribution for a full listing of
- * individual contributors.
- *
- * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
- * is licensed to you under the terms of the GNU Lesser General Public License as
- * published by the Free Software Foundation; either version 2.1 of
- * the License, or (at your option) any later version.
- *
- * JBoss DNA is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with this software; if not, write to the Free
- * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
- * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
- */
-
- //------------------------------------------------------------------------------
-// N A M E S P A C E S
-//------------------------------------------------------------------------------
-<jcr='http://www.jcp.org/jcr/1.0'>
-<nt='http://www.jcp.org/jcr/nt/1.0'>
-<mix='http://www.jcp.org/jcr/mix/1.0'>
-<ddl='http://www.jboss.org/dna/ddl/1.0'>
-<postgresddl='http://www.jboss.org/dna/ddl/postgres/1.0'>
-
-// =============================================================================
-// OPERANDS
-// =============================================================================
-[postgresddl:aggregateOperand] > ddl:operand abstract
-[postgresddl:castOperand] > ddl:operand abstract
-[postgresddl:commentOperand] > ddl:operand abstract
-[postgresddl:constraintTriggerOperand] > ddl:operand abstract
-[postgresddl:conversionOperand] > ddl:operand abstract
-[postgresddl:databaseOperand] > ddl:operand abstract
-[postgresddl:foreignDataOperand] > ddl:operand abstract
-[postgresddl:groupOperand] > ddl:operand abstract
-[postgresddl:functionOperand] > ddl:operand abstract
-[postgresddl:indexOperand] > ddl:operand abstract
-[postgresddl:languageOperand] > ddl:operand abstract
-[postgresddl:operatorOperand] > ddl:operand abstract
-[postgresddl:ownedByOperand] > ddl:operand abstract
-[postgresddl:roleOperand] > ddl:operand abstract
-[postgresddl:ruleOperand] > ddl:operand abstract
-[postgresddl:sequenceOperand] > ddl:operand abstract
-[postgresddl:serverOperand] > ddl:operand abstract
-[postgresddl:tablespaceOperand] > ddl:operand abstract
-[postgresddl:textSearchOperand] > ddl:operand abstract
-[postgresddl:triggerOperand] > ddl:operand abstract
-[postgresddl:typeOperand] > ddl:operand abstract
-[postgresddl:userOperand] > ddl:operand abstract
-[postgresddl:userMappingOperand] > ddl:operand abstract
-
-
-// =============================================================================
-// ALTER STATEMENTS
-// =============================================================================
-[postgresddl:alterAggregateStatement] > ddl:alterable, ddl:statement, postgresddl:aggregateOperand mixin
-[postgresddl:alterConversionStatement] > ddl:alterable, ddl:statement, postgresddl:conversionOperand mixin
-[postgresddl:alterDatabaseStatement] > ddl:alterable, ddl:statement, postgresddl:databaseOperand mixin
-[postgresddl:alterForeignDataWrapperStatement] > ddl:alterable, ddl:statement, postgresddl:foreignDataOperand mixin
-[postgresddl:alterFunctionStatement] > ddl:alterable, ddl:statement, postgresddl:functionOperand mixin
-[postgresddl:alterGroupStatement] > ddl:alterable, ddl:statement, postgresddl:groupOperand mixin
-[postgresddl:alterIndexStatement] > ddl:alterable, ddl:statement, postgresddl:indexOperand mixin
-[postgresddl:alterLanguageStatement] > ddl:alterable, ddl:statement, postgresddl:languageOperand mixin
-[postgresddl:alterOperatorStatement] > ddl:alterable, ddl:statement, postgresddl:operatorOperand mixin
-[postgresddl:alterRoleStatement] > ddl:alterable, ddl:statement, postgresddl:roleOperand mixin
-[postgresddl:alterSchemaStatement] > ddl:alterable, ddl:statement, ddl:schemaOperand mixin
-[postgresddl:alterSequenceStatement] > ddl:alterable, ddl:statement, postgresddl:sequenceOperand mixin
-[postgresddl:alterServerStatement] > ddl:alterable, ddl:statement, postgresddl:serverOperand mixin
-[postgresddl:alterTablespaceStatement] > ddl:alterable, ddl:statement, postgresddl:tablespaceOperand mixin
-[postgresddl:alterTextSearchStatement] > ddl:alterable, ddl:statement, postgresddl:textSearchOperand mixin
-[postgresddl:alterTriggerStatement] > ddl:alterable, ddl:statement, postgresddl:triggerOperand mixin
-[postgresddl:alterTypeStatement] > ddl:alterable, ddl:statement, postgresddl:typeOperand mixin
-[postgresddl:alterUserStatement] > ddl:alterable, ddl:statement, postgresddl:userOperand mixin
-[postgresddl:alterUserMappingStatement] > ddl:alterable, ddl:statement, postgresddl:userMappingOperand mixin
-[postgresddl:alterViewStatement] > ddl:alterable, ddl:statement, ddl:viewOperand mixin
-
-[postgresddl:alterTableStatement] > ddl:alterTableStatement mixin
- - postgresddl:newTableName (STRING)
- - postgresddl:schemaName (STRING)
- + postgresddl:renameColumn (ddl:renamable) = ddl:renamable multiple
-
-
-// =============================================================================
-// CREATE STATEMENTS
-// =============================================================================
-
-[postgresddl:createAggregateStatement] > ddl:creatable, ddl:statement, postgresddl:aggregateOperand mixin
-[postgresddl:createCastStatement] > ddl:creatable, ddl:statement, postgresddl:castOperand mixin
-[postgresddl:createConstraintTriggerStatement] > ddl:creatable, ddl:statement, postgresddl:constraintTriggerOperand mixin
-[postgresddl:createConversionStatement] > ddl:creatable, ddl:statement, postgresddl:conversionOperand mixin
-[postgresddl:createDatabaseStatement] > ddl:creatable, ddl:statement, postgresddl:databaseOperand mixin
-[postgresddl:createForeignDataWrapperStatement] > ddl:creatable, ddl:statement, postgresddl:foreignDataOperand mixin
-[postgresddl:createFunctionStatement] > ddl:creatable, ddl:statement, postgresddl:functionOperand mixin
-[postgresddl:createGroupStatement] > ddl:creatable, ddl:statement, postgresddl:groupOperand mixin
-[postgresddl:createIndexStatement] > ddl:creatable, ddl:statement, postgresddl:indexOperand mixin
-[postgresddl:createLanguageStatement] > ddl:creatable, ddl:statement, postgresddl:languageOperand mixin
-[postgresddl:createOperatorStatement] > ddl:creatable, ddl:statement, postgresddl:operatorOperand mixin
-[postgresddl:createRoleStatement] > ddl:creatable, ddl:statement, postgresddl:roleOperand mixin
-[postgresddl:createRuleStatement] > ddl:creatable, ddl:statement, postgresddl:ruleOperand mixin
-[postgresddl:createSequenceStatement] > ddl:creatable, ddl:statement, postgresddl:sequenceOperand mixin
-[postgresddl:createServerStatement] > ddl:creatable, ddl:statement, postgresddl:serverOperand mixin
-[postgresddl:createTablespaceStatement] > ddl:creatable, ddl:statement, postgresddl:tablespaceOperand mixin
-[postgresddl:createTextSearchStatement] > ddl:creatable, ddl:statement, postgresddl:textSearchOperand mixin
-[postgresddl:createTriggerStatement] > ddl:creatable, ddl:statement, postgresddl:triggerOperand mixin
-[postgresddl:createTypeStatement] > ddl:creatable, ddl:statement, postgresddl:typeOperand mixin
-[postgresddl:createUserStatement] > ddl:creatable, ddl:statement, postgresddl:userOperand mixin
-[postgresddl:createUserMappingStatement] > ddl:creatable, ddl:statement, postgresddl:userMappingOperand mixin
-
-// =============================================================================
-// DROP STATEMENTS
-// =============================================================================
-
-[postgresddl:dropAggregateStatement] > ddl:droppable, ddl:statement, postgresddl:aggregateOperand mixin
-[postgresddl:dropCastStatement] > ddl:droppable, ddl:statement, postgresddl:castOperand mixin
-[postgresddl:dropConstraintTriggerStatement] > ddl:droppable, ddl:statement, postgresddl:constraintTriggerOperand mixin
-[postgresddl:dropConversionStatement] > ddl:droppable, ddl:statement, postgresddl:conversionOperand mixin
-[postgresddl:dropDatabaseStatement] > ddl:droppable, ddl:statement, postgresddl:databaseOperand mixin
-[postgresddl:dropForeignDataWrapperStatement] > ddl:droppable, ddl:statement, postgresddl:foreignDataOperand mixin
-[postgresddl:dropFunctionStatement] > ddl:droppable, ddl:statement, postgresddl:functionOperand mixin
-[postgresddl:dropGroupStatement] > ddl:droppable, ddl:statement, postgresddl:groupOperand mixin
-[postgresddl:dropIndexStatement] > ddl:droppable, ddl:statement, postgresddl:indexOperand mixin
-[postgresddl:dropLanguageStatement] > ddl:droppable, ddl:statement, postgresddl:languageOperand mixin
-[postgresddl:dropOperatorStatement] > ddl:droppable, ddl:statement, postgresddl:operatorOperand mixin
-[postgresddl:dropOwnedByStatement] > ddl:droppable, ddl:statement, postgresddl:ownedByOperand mixin
-[postgresddl:dropRoleStatement] > ddl:droppable, ddl:statement, postgresddl:roleOperand mixin
-[postgresddl:dropRuleStatement] > ddl:droppable, ddl:statement, postgresddl:ruleOperand mixin
-[postgresddl:dropSequenceStatement] > ddl:droppable, ddl:statement, postgresddl:sequenceOperand mixin
-[postgresddl:dropServerStatement] > ddl:droppable, ddl:statement, postgresddl:serverOperand mixin
-[postgresddl:dropTablespaceStatement] > ddl:droppable, ddl:statement, postgresddl:tablespaceOperand mixin
-[postgresddl:dropTextSearchStatement] > ddl:droppable, ddl:statement, postgresddl:textSearchOperand mixin
-[postgresddl:dropTriggerStatement] > ddl:droppable, ddl:statement, postgresddl:triggerOperand mixin
-[postgresddl:dropTypeStatement] > ddl:droppable, ddl:statement, postgresddl:typeOperand mixin
-[postgresddl:dropUserStatement] > ddl:droppable, ddl:statement, postgresddl:userOperand mixin
-[postgresddl:dropUserMappingStatement] > ddl:droppable, ddl:statement, postgresddl:userMappingOperand mixin
-
-// =============================================================================
-// MISC STATEMENTS
-// =============================================================================
-
-[postgresddl:abortStatement] > ddl:statement mixin
-[postgresddl:analyzeStatement] > ddl:statement mixin
-[postgresddl:clusterStatement] > ddl:statement mixin
-[postgresddl:commentOnStatement] > ddl:statement, postgresddl:commentOperand mixin
- - postgresddl:targetObjectType (STRING) mandatory
- - postgresddl:targetObjectName (STRING)
- - postgresddl:comment (STRING) mandatory
-[postgresddl:copyStatement] > ddl:statement mixin
-[postgresddl:deallocateStatement] > ddl:statement mixin
-[postgresddl:declareStatement] > ddl:statement mixin
-[postgresddl:discardStatement] > ddl:statement mixin
-[postgresddl:explainStatement] > ddl:statement mixin
-[postgresddl:fetchStatement] > ddl:statement mixin
-[postgresddl:listenStatement] > ddl:statement mixin
-[postgresddl:loadStatement] > ddl:statement mixin
-[postgresddl:lockTableStatement] > ddl:statement mixin
-[postgresddl:moveStatement] > ddl:statement mixin
-[postgresddl:notifyStatement] > ddl:statement mixin
-[postgresddl:prepareStatement] > ddl:statement mixin
-[postgresddl:reassignOwnedStatement] > ddl:statement mixin
-[postgresddl:reindexStatement] > ddl:statement mixin
-[postgresddl:releaseSavepointStatement] > ddl:statement mixin
-[postgresddl:rollbackStatement] > ddl:statement mixin
-[postgresddl:selectIntoStatement] > ddl:statement mixin
-[postgresddl:showStatement] > ddl:statement mixin
-[postgresddl:truncateStatement] > ddl:statement mixin
-[postgresddl:unlistenStatement] > ddl:statement mixin
-[postgresddl:vacuumStatement] > ddl:statement mixin
-
-
Added: trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdl.cnd
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdl.cnd (rev 0)
+++ trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdl.cnd 2010-01-05 13:31:47 UTC (rev 1527)
@@ -0,0 +1,205 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+
+ //------------------------------------------------------------------------------
+// N A M E S P A C E S
+//------------------------------------------------------------------------------
+<jcr='http://www.jcp.org/jcr/1.0'>
+<nt='http://www.jcp.org/jcr/nt/1.0'>
+<mix='http://www.jcp.org/jcr/mix/1.0'>
+<ddl='http://www.jboss.org/dna/ddl/1.0'>
+<postgresddl='http://www.jboss.org/dna/ddl/postgres/1.0'>
+
+// =============================================================================
+// OPERANDS
+// =============================================================================
+[postgresddl:aggregateOperand] > ddl:operand abstract
+[postgresddl:castOperand] > ddl:operand abstract
+[postgresddl:commentOperand] > ddl:operand abstract
+[postgresddl:constraintTriggerOperand] > ddl:operand abstract
+[postgresddl:conversionOperand] > ddl:operand abstract
+[postgresddl:databaseOperand] > ddl:operand abstract
+[postgresddl:foreignDataOperand] > ddl:operand abstract
+[postgresddl:groupOperand] > ddl:operand abstract
+[postgresddl:functionOperand] > ddl:operand abstract
+[postgresddl:indexOperand] > ddl:operand abstract
+[postgresddl:languageOperand] > ddl:operand abstract
+[postgresddl:operatorOperand] > ddl:operand abstract
+[postgresddl:ownedByOperand] > ddl:operand abstract
+[postgresddl:roleOperand] > ddl:operand abstract
+[postgresddl:ruleOperand] > ddl:operand abstract
+[postgresddl:sequenceOperand] > ddl:operand abstract
+[postgresddl:serverOperand] > ddl:operand abstract
+[postgresddl:tablespaceOperand] > ddl:operand abstract
+[postgresddl:textSearchOperand] > ddl:operand abstract
+[postgresddl:triggerOperand] > ddl:operand abstract
+[postgresddl:typeOperand] > ddl:operand abstract
+[postgresddl:userOperand] > ddl:operand abstract
+[postgresddl:userMappingOperand] > ddl:operand abstract
+[postgresddl:parameterOperand] > ddl:operand abstract
+
+[postgresddl:functionParameter] > postgresddl:parameterOperand mixin
+ - ddl:datatypeName (STRING) mandatory
+ - ddl:datatypeLength (LONG)
+ - ddl:datatypePrecision (LONG)
+ - ddl:datatypeScale (LONG)
+ - ddl:nullable (STRING)
+ - ddl:defaultOption (STRING)
+ - postgresddl:mode (STRING)
+
+[postgresddl:role] > postgresddl:roleOperand mixin
+
+[postgresddl:renamedColumn] > ddl:renamable mixin
+
+// =============================================================================
+// ALTER STATEMENTS
+// =============================================================================
+[postgresddl:alterAggregateStatement] > ddl:alterable, ddl:statement, postgresddl:aggregateOperand mixin
+[postgresddl:alterConversionStatement] > ddl:alterable, ddl:statement, postgresddl:conversionOperand mixin
+[postgresddl:alterDatabaseStatement] > ddl:alterable, ddl:statement, postgresddl:databaseOperand mixin
+[postgresddl:alterForeignDataWrapperStatement] > ddl:alterable, ddl:statement, postgresddl:foreignDataOperand mixin
+[postgresddl:alterFunctionStatement] > ddl:alterable, ddl:statement, postgresddl:functionOperand mixin
+[postgresddl:alterGroupStatement] > ddl:alterable, ddl:statement, postgresddl:groupOperand mixin
+[postgresddl:alterIndexStatement] > ddl:alterable, ddl:statement, postgresddl:indexOperand mixin
+[postgresddl:alterLanguageStatement] > ddl:alterable, ddl:statement, postgresddl:languageOperand mixin
+[postgresddl:alterOperatorStatement] > ddl:alterable, ddl:statement, postgresddl:operatorOperand mixin
+[postgresddl:alterRoleStatement] > ddl:alterable, ddl:statement, postgresddl:roleOperand mixin
+[postgresddl:alterSchemaStatement] > ddl:alterable, ddl:statement, ddl:schemaOperand mixin
+[postgresddl:alterSequenceStatement] > ddl:alterable, ddl:statement, postgresddl:sequenceOperand mixin
+[postgresddl:alterServerStatement] > ddl:alterable, ddl:statement, postgresddl:serverOperand mixin
+[postgresddl:alterTablespaceStatement] > ddl:alterable, ddl:statement, postgresddl:tablespaceOperand mixin
+[postgresddl:alterTextSearchStatement] > ddl:alterable, ddl:statement, postgresddl:textSearchOperand mixin
+[postgresddl:alterTriggerStatement] > ddl:alterable, ddl:statement, postgresddl:triggerOperand mixin
+[postgresddl:alterTypeStatement] > ddl:alterable, ddl:statement, postgresddl:typeOperand mixin
+[postgresddl:alterUserStatement] > ddl:alterable, ddl:statement, postgresddl:userOperand mixin
+[postgresddl:alterUserMappingStatement] > ddl:alterable, ddl:statement, postgresddl:userMappingOperand mixin
+[postgresddl:alterViewStatement] > ddl:alterable, ddl:statement, ddl:viewOperand mixin
+
+[postgresddl:alterTableStatement] > ddl:alterTableStatement mixin
+ - postgresddl:newTableName (STRING)
+ - postgresddl:schemaName (STRING)
+ + postgresddl:renameColumn (postgresddl:renamedColumn) = postgresddl:renamedColumn multiple
+
+
+// =============================================================================
+// CREATE STATEMENTS
+// =============================================================================
+
+[postgresddl:createAggregateStatement] > ddl:creatable, ddl:statement, postgresddl:aggregateOperand mixin
+[postgresddl:createCastStatement] > ddl:creatable, ddl:statement, postgresddl:castOperand mixin
+[postgresddl:createConstraintTriggerStatement] > ddl:creatable, ddl:statement, postgresddl:constraintTriggerOperand mixin
+[postgresddl:createConversionStatement] > ddl:creatable, ddl:statement, postgresddl:conversionOperand mixin
+[postgresddl:createDatabaseStatement] > ddl:creatable, ddl:statement, postgresddl:databaseOperand mixin
+[postgresddl:createForeignDataWrapperStatement] > ddl:creatable, ddl:statement, postgresddl:foreignDataOperand mixin
+[postgresddl:createFunctionStatement] > ddl:creatable, ddl:statement, postgresddl:functionOperand mixin
+[postgresddl:createGroupStatement] > ddl:creatable, ddl:statement, postgresddl:groupOperand mixin
+[postgresddl:createIndexStatement] > ddl:creatable, ddl:statement, postgresddl:indexOperand mixin
+[postgresddl:createLanguageStatement] > ddl:creatable, ddl:statement, postgresddl:languageOperand mixin
+[postgresddl:createOperatorStatement] > ddl:creatable, ddl:statement, postgresddl:operatorOperand mixin
+[postgresddl:createRoleStatement] > ddl:creatable, ddl:statement, postgresddl:roleOperand mixin
+[postgresddl:createRuleStatement] > ddl:creatable, ddl:statement, postgresddl:ruleOperand mixin
+[postgresddl:createSequenceStatement] > ddl:creatable, ddl:statement, postgresddl:sequenceOperand mixin
+[postgresddl:createServerStatement] > ddl:creatable, ddl:statement, postgresddl:serverOperand mixin
+[postgresddl:createTablespaceStatement] > ddl:creatable, ddl:statement, postgresddl:tablespaceOperand mixin
+[postgresddl:createTextSearchStatement] > ddl:creatable, ddl:statement, postgresddl:textSearchOperand mixin
+[postgresddl:createTriggerStatement] > ddl:creatable, ddl:statement, postgresddl:triggerOperand mixin
+[postgresddl:createTypeStatement] > ddl:creatable, ddl:statement, postgresddl:typeOperand mixin
+[postgresddl:createUserStatement] > ddl:creatable, ddl:statement, postgresddl:userOperand mixin
+[postgresddl:createUserMappingStatement] > ddl:creatable, ddl:statement, postgresddl:userMappingOperand mixin
+
+// =============================================================================
+// DROP STATEMENTS
+// =============================================================================
+
+[postgresddl:dropAggregateStatement] > ddl:droppable, ddl:statement, postgresddl:aggregateOperand mixin
+[postgresddl:dropCastStatement] > ddl:droppable, ddl:statement, postgresddl:castOperand mixin
+[postgresddl:dropConstraintTriggerStatement] > ddl:droppable, ddl:statement, postgresddl:constraintTriggerOperand mixin
+[postgresddl:dropConversionStatement] > ddl:droppable, ddl:statement, postgresddl:conversionOperand mixin
+[postgresddl:dropDatabaseStatement] > ddl:droppable, ddl:statement, postgresddl:databaseOperand mixin
+[postgresddl:dropForeignDataWrapperStatement] > ddl:droppable, ddl:statement, postgresddl:foreignDataOperand mixin
+[postgresddl:dropFunctionStatement] > ddl:droppable, ddl:statement, postgresddl:functionOperand mixin
+[postgresddl:dropGroupStatement] > ddl:droppable, ddl:statement, postgresddl:groupOperand mixin
+[postgresddl:dropIndexStatement] > ddl:droppable, ddl:statement, postgresddl:indexOperand mixin
+[postgresddl:dropLanguageStatement] > ddl:droppable, ddl:statement, postgresddl:languageOperand mixin
+[postgresddl:dropOperatorStatement] > ddl:droppable, ddl:statement, postgresddl:operatorOperand mixin
+[postgresddl:dropOwnedByStatement] > ddl:droppable, ddl:statement, postgresddl:ownedByOperand mixin
+[postgresddl:dropRoleStatement] > ddl:droppable, ddl:statement, postgresddl:roleOperand mixin
+[postgresddl:dropRuleStatement] > ddl:droppable, ddl:statement, postgresddl:ruleOperand mixin
+[postgresddl:dropSequenceStatement] > ddl:droppable, ddl:statement, postgresddl:sequenceOperand mixin
+[postgresddl:dropServerStatement] > ddl:droppable, ddl:statement, postgresddl:serverOperand mixin
+[postgresddl:dropTablespaceStatement] > ddl:droppable, ddl:statement, postgresddl:tablespaceOperand mixin
+[postgresddl:dropTextSearchStatement] > ddl:droppable, ddl:statement, postgresddl:textSearchOperand mixin
+[postgresddl:dropTriggerStatement] > ddl:droppable, ddl:statement, postgresddl:triggerOperand mixin
+[postgresddl:dropTypeStatement] > ddl:droppable, ddl:statement, postgresddl:typeOperand mixin
+[postgresddl:dropUserStatement] > ddl:droppable, ddl:statement, postgresddl:userOperand mixin
+[postgresddl:dropUserMappingStatement] > ddl:droppable, ddl:statement, postgresddl:userMappingOperand mixin
+
+// =============================================================================
+// MISC STATEMENTS
+// =============================================================================
+
+[postgresddl:abortStatement] > ddl:statement mixin
+[postgresddl:analyzeStatement] > ddl:statement mixin
+[postgresddl:clusterStatement] > ddl:statement mixin
+[postgresddl:commentOnStatement] > ddl:statement, postgresddl:commentOperand mixin
+ - postgresddl:targetObjectType (STRING) mandatory
+ - postgresddl:targetObjectName (STRING)
+ - postgresddl:comment (STRING) mandatory
+[postgresddl:copyStatement] > ddl:statement mixin
+[postgresddl:deallocateStatement] > ddl:statement mixin
+[postgresddl:declareStatement] > ddl:statement mixin
+[postgresddl:discardStatement] > ddl:statement mixin
+[postgresddl:explainStatement] > ddl:statement mixin
+[postgresddl:fetchStatement] > ddl:statement mixin
+[postgresddl:listenStatement] > ddl:statement mixin
+[postgresddl:loadStatement] > ddl:statement mixin
+[postgresddl:lockTableStatement] > ddl:statement mixin
+[postgresddl:moveStatement] > ddl:statement mixin
+[postgresddl:notifyStatement] > ddl:statement mixin
+[postgresddl:prepareStatement] > ddl:statement mixin
+[postgresddl:reassignOwnedStatement] > ddl:statement mixin
+[postgresddl:reindexStatement] > ddl:statement mixin
+[postgresddl:releaseSavepointStatement] > ddl:statement mixin
+[postgresddl:rollbackStatement] > ddl:statement mixin
+[postgresddl:selectIntoStatement] > ddl:statement mixin
+[postgresddl:showStatement] > ddl:statement mixin
+[postgresddl:truncateStatement] > ddl:statement mixin
+[postgresddl:unlistenStatement] > ddl:statement mixin
+[postgresddl:vacuumStatement] > ddl:statement mixin
+
+// =============================================================================
+// GRANT STATEMENTS
+// =============================================================================
+[postgresddl:grantOnTableStatement] > ddl:grantStatement, ddl:tableOperand mixin
+[postgresddl:grantOnSequenceStatement] > ddl:grantStatement, postgresddl:sequenceOperand mixin
+[postgresddl:grantOnDatabaseStatement] > ddl:grantStatement, postgresddl:databaseOperand mixin
+[postgresddl:grantOnForeignDataWrapperStatement] > ddl:grantStatement, postgresddl:foreignDataOperand mixin
+[postgresddl:grantOnForeignServerStatement] > ddl:grantStatement, postgresddl:serverOperand mixin
+[postgresddl:grantOnFunctionStatement] > ddl:grantStatement, postgresddl:functionOperand mixin
+ + postgresddl:parameter (postgresddl:functionParameter) = postgresddl:functionParameter multiple
+[postgresddl:grantOnLanguageStatement] > ddl:grantStatement, postgresddl:languageOperand mixin
+[postgresddl:grantOnSchemaStatement] > ddl:grantStatement, ddl:schemaOperand mixin
+[postgresddl:grantOnTablespaceStatement] > ddl:grantStatement, postgresddl:tablespaceOperand mixin
+[postgresddl:grantRolesStatement] > ddl:grantStatement mixin
+ + postgresddl:grantRole (postgresddl:role) = postgresddl:role multiple
Property changes on: trunk/extensions/dna-sequencer-ddl/src/main/resources/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdl.cnd
___________________________________________________________________
Name: svn:executable
+ *
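The CND file added above follows one regular pattern throughout: `[nodeTypeName] > supertype1, supertype2, ... mixin`. As a rough illustration of that declaration shape only (a sketch for this note, not DNA's actual CND reader), a minimal Java helper can pull the node-type name and supertype list out of one such line:

```java
import java.util.Arrays;
import java.util.List;

// Minimal, illustrative parser for a single CND node-type declaration line,
// e.g. "[postgresddl:dropRoleStatement] > ddl:droppable, ddl:statement mixin".
// This is a sketch only, NOT the parser DNA uses to load PostgresDdl.cnd.
public class CndDeclSketch {
    static String name(String line) {
        // The node-type name sits between the square brackets.
        return line.substring(line.indexOf('[') + 1, line.indexOf(']'));
    }

    static List<String> supertypes(String line) {
        // Everything after '>' is the supertype list, possibly followed by
        // attribute keywords such as "mixin" or "abstract".
        String rest = line.substring(line.indexOf('>') + 1).trim();
        rest = rest.replaceAll("\\s+(mixin|abstract)\\s*$", "");
        return Arrays.asList(rest.split("\\s*,\\s*"));
    }

    public static void main(String[] args) {
        String decl = "[postgresddl:dropRoleStatement] > ddl:droppable, ddl:statement, postgresddl:roleOperand mixin";
        System.out.println(name(decl));        // postgresddl:dropRoleStatement
        System.out.println(supertypes(decl));  // [ddl:droppable, ddl:statement, postgresddl:roleOperand]
    }
}
```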
Modified: trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/DdlParserTestHelper.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/DdlParserTestHelper.java 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/DdlParserTestHelper.java 2010-01-05 13:31:47 UTC (rev 1527)
@@ -39,7 +39,7 @@
*/
public class DdlParserTestHelper implements DdlConstants {
private boolean printToConsole = false;
-
+ public final static String NEWLINE = "\n";
/**
*
Modified: trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/DdlSequencerTest.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/DdlSequencerTest.java 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/DdlSequencerTest.java 2010-01-05 13:31:47 UTC (rev 1527)
@@ -47,6 +47,7 @@
import org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon;
import org.junit.After;
import org.junit.Before;
+import org.junit.Ignore;
import org.junit.Test;
/**
@@ -488,6 +489,7 @@
assertThat(verifyMixinType(node_2, "ddl:columnDefinition"), is(true));
}
+ @Ignore
@Test
public void shouldSequenceDerbyDdl() throws IOException {
URL url = this.getClass().getClassLoader().getResource("ddl/dialect/derby/derby_test_statements.ddl");
Modified: trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/StandardDdlParserTest.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/StandardDdlParserTest.java 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/StandardDdlParserTest.java 2010-01-05 13:31:47 UTC (rev 1527)
@@ -790,7 +790,7 @@
boolean success = parser.parse(content, rootNode);
assertThat(success, is(true));
- assertThat(rootNode.getChildCount(), is(7));
+ assertThat(rootNode.getChildCount(), is(11));
// List<AstNode> theNodes = parser.nodeFactory().getChildrenForType(rootNode, TYPE_MISSING_TERMINATOR);
// assertThat(theNodes.size(), is(3));
@@ -834,4 +834,17 @@
assertThat(rootNode.getChildCount(), is(3));
}
+
+ @Test
+ public void shouldParseGrantStatements() {
+ printTest("shouldParseGrantStatements()");
+ String content = "GRANT SELECT ON TABLE purchaseOrders TO maria,harry;" + NEWLINE
+ + "GRANT UPDATE, USAGE ON TABLE purchaseOrders TO anita,zhi;" + NEWLINE
+ + "GRANT SELECT ON TABLE orders.bills to PUBLIC;" + NEWLINE
+ + "GRANT INSERT(a, b, c) ON TABLE purchaseOrders TO purchases_reader_role;";
+
+ boolean success = parser.parse(content, rootNode);
+ assertThat(true, is(success));
+ assertThat(rootNode.getChildCount(), is(4));
+ }
}
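The new `shouldParseGrantStatements` test above asserts a root child count of 4 because the parser produces one AST child per terminated GRANT statement. A naive terminator-based count (a hypothetical helper written for this note, not DNA code — the real parser tokenizes rather than splitting on semicolons) shows why that expectation holds for the four-statement input:

```java
// Illustrative only: DNA's StandardDdlParser builds one AST child per
// terminated DDL statement, so four GRANT statements give getChildCount() == 4.
// This helper approximates that by counting non-empty ';'-terminated chunks.
public class GrantCountSketch {
    static int countStatements(String ddl) {
        int count = 0;
        for (String part : ddl.split(";")) {
            if (!part.trim().isEmpty()) count++;
        }
        return count;
    }

    public static void main(String[] args) {
        String content = "GRANT SELECT ON TABLE purchaseOrders TO maria,harry;\n"
                + "GRANT UPDATE, USAGE ON TABLE purchaseOrders TO anita,zhi;\n"
                + "GRANT SELECT ON TABLE orders.bills to PUBLIC;\n"
                + "GRANT INSERT(a, b, c) ON TABLE purchaseOrders TO purchases_reader_role;";
        System.out.println(countStatements(content)); // 4
    }
}
```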
Modified: trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlParserTest.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlParserTest.java 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlParserTest.java 2010-01-05 13:31:47 UTC (rev 1527)
@@ -4,17 +4,14 @@
import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_PROBLEM;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertThat;
-
import java.util.List;
-
import org.jboss.dna.sequencer.ddl.DdlParserTestHelper;
-import org.jboss.dna.sequencer.ddl.StandardDdlParser;
import org.jboss.dna.sequencer.ddl.node.AstNode;
import org.junit.Before;
import org.junit.Test;
-public class DerbyDdlParserTest extends DdlParserTestHelper{
- private StandardDdlParser parser;
+public class DerbyDdlParserTest extends DdlParserTestHelper {
+ private DerbyDdlParser parser;
private AstNode rootNode;
public static final String DDL_FILE_PATH = "src/test/resources/ddl/dialect/derby/";
@@ -29,27 +26,69 @@
rootNode = parser.nodeFactory().node("ddlRootNode");
}
-// @Test
-// public void shouldParseDerbyDDL() {
-// String content = getFileContent(DDL_FILE_PATH + "derby_test_create.ddl");
-//
-// List<Statement> stmts = parser.parse(content);
-//
-// assertEquals(744, stmts.size());
-//
-// System.out.println(" END PARSING. # Statements = " + stmts.size());
-//
-// }
+ @Test
+ public void shouldParseCreateFunctionWithDataTypeReturn() {
+ //setPrintToConsole(true);
+ //parser.setTestMode(isPrintToConsole());
+ printTest("shouldParseCreateFunctionWithDataTypeReturn()");
+ String content = "CREATE FUNCTION TO_DEGREES" + NEWLINE
+ + "( RADIANS DOUBLE )" + NEWLINE
+ + "RETURNS DOUBLE" + NEWLINE
+ + "PARAMETER STYLE JAVA" + NEWLINE
+ + "NO SQL LANGUAGE JAVA" + NEWLINE
+ + "EXTERNAL NAME 'java.lang.Math.toDegrees';";
+
+ boolean success = parser.parse(content, rootNode);
+ assertThat(true, is(success));
+
+ }
+ @Test
+ public void shouldParseCreateFunctionWithTableTypeReturn() {
+ //setPrintToConsole(true);
+ //parser.setTestMode(isPrintToConsole());
+ printTest("shouldParseCreateFunctionWithTableTypeReturn()");
+ String content = "CREATE FUNCTION PROPERTY_FILE_READER" + NEWLINE
+ + "( FILENAME VARCHAR( 32672 ), FILESIZE INTEGER )" + NEWLINE
+ + "RETURNS TABLE (KEY_COL VARCHAR( 10 ), VALUE_COL VARCHAR( 1000 ))" + NEWLINE
+ + "LANGUAGE JAVA" + NEWLINE
+ + "PARAMETER STYLE DERBY_JDBC_RESULT_SET" + NEWLINE
+ + "NO SQL" + NEWLINE
+ + "EXTERNAL NAME 'vtis.example.PropertyFileVTI.propertyFileVTI';";
+
+ boolean success = parser.parse(content, rootNode);
+ assertThat(true, is(success));
+ }
+
@Test
public void shouldParseDropSchemaRestrict() {
- printTest("shouldParseDeclareGlobaTemporaryTable()");
+ printTest("shouldParseDropSchemaRestrict()");
String content = "DROP SCHEMA SAMP RESTRICT;";
boolean success = parser.parse(content, rootNode);
assertThat(true, is(success));
+ assertThat(rootNode.getChildCount(), is(1));
}
+ @Test
+ public void shouldParseCreateIndex() {
+ printTest("shouldParseCreateIndex()");
+ String content = "CREATE INDEX PAY_DESC ON SAMP.EMPLOYEE (SALARY DESC, UNIT);";
+
+ boolean success = parser.parse(content, rootNode);
+ assertThat(true, is(success));
+ assertThat(rootNode.getChildCount(), is(1));
+ }
+
+ @Test
+ public void shouldParseLockTable() {
+ printTest("shouldParseLockTable()");
+ String content = "LOCK TABLE FlightAvailability IN EXCLUSIVE MODE;";
+
+ boolean success = parser.parse(content, rootNode);
+ assertThat(true, is(success));
+ assertThat(rootNode.getChildCount(), is(1));
+ }
@Test
public void shouldParseDeclareGlobaTemporaryTable() {
@@ -59,8 +98,69 @@
boolean success = parser.parse(content, rootNode);
assertThat(true, is(success));
}
-
+
@Test
+ public void shouldParseRenameTable() {
+ printTest("shouldParseRenameTable()");
+ String content = "RENAME TABLE SAMP.EMP_ACT TO EMPLOYEE_ACT;";
+
+ boolean success = parser.parse(content, rootNode);
+ assertThat(true, is(success));
+ assertThat(rootNode.getChildCount(), is(1));
+ }
+
+ @Test
+ public void shouldParseCreateSynonym() {
+ printTest("shouldParseCreateSynonym()");
+ String content = "CREATE SYNONYM SAMP.T1 FOR SAMP.TABLEWITHLONGNAME;";
+
+ boolean success = parser.parse(content, rootNode);
+ assertThat(true, is(success));
+ assertThat(rootNode.getChildCount(), is(1));
+ }
+
+ @Test
+ public void shouldParseCreateTrigger() {
+ printTest("shouldParseCreateTrigger()");
+ String content = "CREATE TRIGGER FLIGHTSDELETE3" + NEWLINE
+ + "AFTER DELETE ON FLIGHTS" + NEWLINE
+ + "REFERENCING OLD AS OLD" + NEWLINE
+ + "FOR EACH ROW" + NEWLINE
+ + "DELETE FROM FLIGHTAVAILABILITY WHERE FLIGHT_ID = OLD.FLIGHT_ID;";
+
+ boolean success = parser.parse(content, rootNode);
+ assertThat(true, is(success));
+ assertThat(rootNode.getChildCount(), is(1));
+ }
+
+ @Test
+ public void shouldParseCreateTrigger_2() {
+ printTest("shouldParseCreateTrigger_2()");
+ String content = "CREATE TRIGGER t1 NO CASCADE BEFORE UPDATE ON x" + NEWLINE
+ + "FOR EACH ROW MODE DB2SQL" + NEWLINE
+ + "values app.notifyEmail('Jerry', 'Table x is about to be updated');";
+
+ boolean success = parser.parse(content, rootNode);
+ assertThat(true, is(success));
+ assertThat(rootNode.getChildCount(), is(1));
+ }
+
+ @Test
+ public void shouldParseGrantStatements() {
+ printTest("shouldParseGrantStatements()");
+ String content = "GRANT SELECT ON TABLE purchaseOrders TO maria,harry;" + NEWLINE
+ + "GRANT UPDATE, TRIGGER ON TABLE purchaseOrders TO anita,zhi;" + NEWLINE
+ + "GRANT SELECT ON TABLE orders.bills to PUBLIC;" + NEWLINE
+ + "GRANT EXECUTE ON PROCEDURE updatePurchases TO george;" + NEWLINE
+ + "GRANT purchases_reader_role TO george,maria;" + NEWLINE
+ + "GRANT SELECT ON TABLE purchaseOrders TO purchases_reader_role;";
+
+ boolean success = parser.parse(content, rootNode);
+ assertThat(true, is(success));
+ assertThat(rootNode.getChildCount(), is(6));
+ }
+
+ @Test
public void shouldParseAlterTableAlterColumnDefaultRealNumber() {
printTest("shouldParseAlterTableAlterColumnDefaultRealNumber()");
String content = "ALTER TABLE Employees ALTER COLUMN Salary DEFAULT 1000.0;";
@@ -82,6 +182,8 @@
@Test
public void shouldParseDerbyStatements() {
+ //setPrintToConsole(true);
+ //parser.setTestMode(isPrintToConsole());
printTest("shouldParseDerbyStatements()");
String content = getFileContent(DDL_FILE_PATH + "derby_test_statements.ddl");
@@ -91,6 +193,5 @@
List<AstNode> problems = parser.nodeFactory().getChildrenForType(rootNode, TYPE_PROBLEM);
int nStatements = rootNode.getChildCount() - problems.size();
assertEquals(64, nStatements);
-
}
}
\ No newline at end of file
Modified: trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlParserTest.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlParserTest.java 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/test/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlParserTest.java 2010-01-05 13:31:47 UTC (rev 1527)
@@ -23,16 +23,8 @@
*/
package org.jboss.dna.sequencer.ddl.dialect.postgres;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_CREATE_SCHEMA_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_CREATE_TABLE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.TYPE_DROP_COLUMN_DEFINITION;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_FOREIGN_DATA_WRAPPER_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_ALTER_TABLE_STATEMENT_POSTGRES;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_COMMENT_ON_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_RULE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_CREATE_SEQUENCE_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_LISTEN_STATEMENT;
-import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.TYPE_RENAME_COLUMN;
+import static org.jboss.dna.sequencer.ddl.StandardDdlLexicon.*;
+import static org.jboss.dna.sequencer.ddl.dialect.postgres.PostgresDdlLexicon.*;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;
import org.jboss.dna.graph.JcrLexicon;
@@ -422,8 +414,59 @@
assertEquals(true, success);
assertEquals(2, rootNode.getChildCount());
}
+
+ @Test
+ public void shouldParseGrantOnTable() {
+ printTest("shouldParseGrantOnTable()");
+ String content = "GRANT UPDATE, TRIGGER ON TABLE t TO anita,zhi;";
+ boolean success = parser.parse(content, rootNode);
+
+ assertEquals(true, success);
+ assertEquals(1, rootNode.getChildCount());
+ }
+
@Test
+ public void shouldParseGrantOnMultipleTables() {
+ printTest("shouldParseGrantOnMultipleTables()");
+ String content = "GRANT UPDATE, TRIGGER ON TABLE t1, t2, t3 TO anita,zhi;";
+
+ boolean success = parser.parse(content, rootNode);
+
+ assertEquals(true, success);
+ assertEquals(3, rootNode.getChildCount());
+ }
+
+ @Test
+ public void shouldParseGrantExecuteOnFunction() {
+ printTest("shouldParseGrantExecuteOnFunction()");
+ String content = "GRANT EXECUTE ON FUNCTION divideByTwo(numerator int, IN demoninator int) TO george;";
+
+ boolean success = parser.parse(content, rootNode);
+
+ assertEquals(true, success);
+ assertEquals(1, rootNode.getChildCount());
+ AstNode childNode = rootNode.getChildren().get(0);
+ assertTrue(hasMixinType(childNode.getProperty(JcrLexicon.MIXIN_TYPES), TYPE_GRANT_ON_FUNCTION_STATEMENT));
+ }
+
+ @Test
+ public void shouldParseGrantExecuteAndUpdateOnMultipleFunctions() {
+ printTest("shouldParseGrantExecuteOnMultipleFunctions()");
+ String content = "GRANT EXECUTE, UPDATE ON FUNCTION cos(), sin(b double precision) TO peter;";
+
+ boolean success = parser.parse(content, rootNode);
+
+ assertEquals(true, success);
+ assertEquals(2, rootNode.getChildCount());
+ AstNode childNode = rootNode.getChild(0);
+ assertTrue(hasMixinType(childNode.getProperty(JcrLexicon.MIXIN_TYPES), TYPE_GRANT_ON_FUNCTION_STATEMENT));
+ assertEquals(3, childNode.getChildCount());
+ childNode = rootNode.getChild(1);
+ assertEquals(4, childNode.getChildCount());
+ }
+
+ @Test
public void shouldParsePostgresStatements_1() {
printTest("shouldParsePostgresStatements_1()");
String content = getFileContent(DDL_FILE_PATH + "postgres_test_statements_1.ddl");
Modified: trunk/extensions/dna-sequencer-ddl/src/test/resources/ddl/dialect/postgres/postgres_test_statements_4.ddl
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/test/resources/ddl/dialect/postgres/postgres_test_statements_4.ddl 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/test/resources/ddl/dialect/postgres/postgres_test_statements_4.ddl 2010-01-05 13:31:47 UTC (rev 1527)
@@ -43,7 +43,7 @@
GRANT SELECT ON TABLE s.v to PUBLIC;
-GRANT EXECUTE ON PROCEDURE p TO george;
+GRANT EXECUTE ON FUNCTION p(a int, b TEXT) TO george;
-- 10 STATEMENTS *******************************************************
GRANT purchases_reader_role TO george,maria;
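One behavior worth calling out from the Postgres test additions above: `shouldParseGrantOnMultipleTables` expects a child count of 3 for `GRANT ... ON TABLE t1, t2, t3 ...`, i.e. a multi-table grant is recorded as one grant node per table. The expansion below is a hedged sketch of that idea (hypothetical helper and string handling, not DNA's parser):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the behavior asserted in shouldParseGrantOnMultipleTables:
// a GRANT naming several tables expands to one per-table grant, so
// "GRANT ... ON TABLE t1, t2, t3 TO ..." yields three nodes.
public class GrantExpansionSketch {
    static List<String> expand(String grant) {
        int on = grant.indexOf("ON TABLE ") + "ON TABLE ".length();
        int to = grant.indexOf(" TO ");
        String[] tables = grant.substring(on, to).split("\\s*,\\s*");
        String prefix = grant.substring(0, on); // "GRANT <privileges> ON TABLE "
        String suffix = grant.substring(to);    // " TO <grantees>;"
        List<String> result = new ArrayList<>();
        for (String t : tables) {
            result.add(prefix + t + suffix);
        }
        return result;
    }

    public static void main(String[] args) {
        List<String> nodes = expand("GRANT UPDATE, TRIGGER ON TABLE t1, t2, t3 TO anita,zhi;");
        System.out.println(nodes.size()); // 3
        System.out.println(nodes.get(0)); // GRANT UPDATE, TRIGGER ON TABLE t1 TO anita,zhi;
    }
}
```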
Modified: trunk/extensions/dna-sequencer-ddl/src/test/resources/ddl/standardDdlTest.ddl
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/test/resources/ddl/standardDdlTest.ddl 2010-01-05 13:24:33 UTC (rev 1526)
+++ trunk/extensions/dna-sequencer-ddl/src/test/resources/ddl/standardDdlTest.ddl 2010-01-05 13:31:47 UTC (rev 1527)
@@ -37,4 +37,13 @@
(HOTEL_ID INT NOT NULL, BOOKING_DATE DATE NOT NULL,
ROOMS_TAKEN INT DEFAULT 0, PRIMARY KEY (HOTEL_ID, BOOKING_DATE));
+GRANT SELECT ON TABLE purchaseOrders TO maria,harry;
+GRANT UPDATE, USAGE ON TABLE purchaseOrders TO anita,zhi;
+
+GRANT SELECT ON TABLE orders.bills to PUBLIC;
+
+GRANT INSERT(a, b, c) ON TABLE purchaseOrders TO purchases_reader_role;
+
+
+
DNA SVN: r1526 - trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn.
by dna-commits@lists.jboss.org
Author: bcarothers
Date: 2010-01-05 08:24:33 -0500 (Tue, 05 Jan 2010)
New Revision: 1526
Added:
trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn/SvnRepositoryConnectorI18n.properties
Removed:
trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn/SVNRepositoryConnectorI18n.properties
Log:
Renamed properties file to correct a case-sensitivity issue
Deleted: trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn/SVNRepositoryConnectorI18n.properties
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn/SVNRepositoryConnectorI18n.properties 2010-01-05 12:50:08 UTC (rev 1525)
+++ trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn/SVNRepositoryConnectorI18n.properties 2010-01-05 13:24:33 UTC (rev 1526)
@@ -1,59 +0,0 @@
-#
-# JBoss DNA (http://www.jboss.org/dna)
-# See the COPYRIGHT.txt file distributed with this work for information
-# regarding copyright ownership. Some portions may be licensed
-# to Red Hat, Inc. under one or more contributor license agreements.
-# See the AUTHORS.txt file in the distribution for a full listing of
-# individual contributors.
-#
-# JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
-# is licensed to you under the terms of the GNU Lesser General Public License as
-# published by the Free Software Foundation; either version 2.1 of
-# the License, or (at your option) any later version.
-#
-# JBoss DNA is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
-# Lesser General Public License for more details.
-#
-# You should have received a copy of the GNU Lesser General Public
-# License along with this software; if not, write to the Free
-# Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
-# 02110-1301 USA, or see the FSF site: http://www.fsf.org.
-#
-
-connectorName = SVN Connector
-nodeDoesNotExist = This node kind is missing at {0}
-nodeAlreadyExist = This node {0} already exist.
-nodeIsActuallyUnknow = This node kind is actually unknown {0}
-propertyIsRequired = The {0} property is required but has no value
-errorSerializingCachePolicyInSource = Error serializing a {0} instance owned by the {1} SVNRepositorySource
-sourceIsReadOnly = {0} is a read-only source; no updates are allowed
-sourceDoesNotSupportCreatingWorkspaces = {0} is a source that does not allow creating workspaces
-sourceDoesNotSupportCloningWorkspaces = {0} is a source that does not allow cloning workspaces
-sourceDoesNotSupportDeletingWorkspaces = {0} is a source that does not allow deleting workspaces
-connectingFailureOrUserAuthenticationProblem=failure occured while connecting to the repository {0} or the user's authentication failed
-workspaceDoesNotExist = The workspace "{0}" does not exist
-pathForDefaultWorkspaceDoesNotExist = The path "{0}" for the default workspace for the file system source "{1}" does not represent an existing directory
-pathForDefaultWorkspaceIsNotDirectory = The path "{0}" for the default workspace for the file system source "{1}" is actually a path to an existing file
-pathForDefaultWorkspaceCannotBeRead = The path "{0}" for the default workspace for the file system source "{1}" cannot be read
-pathForPredefinedWorkspaceDoesNotExist = The path "{0}" for the predefined workspace for the file system source "{1}" does not represent an existing directory
-pathForPredefinedWorkspaceIsNotDirectory = The path "{0}" for the predefined workspace for the file system source "{1}" is actually a path to an existing file
-pathForPredefinedWorkspaceCannotBeRead = The path "{0}" for the predefined workspace for the file system source "{1}" cannot be read
-sameNameSiblingsAreNotAllowed = {0}
-onlyTheDefaultNamespaceIsAllowed = {0} requires node names use the default namespace: {1}
-locationInRequestMustHavePath = {0} requires a path in the request: {1}
-unableToCreateWorkspaces = {0} does not allow creating new workspaces (request was to create "{1}")
-pathForRequestIsNotCorrect = The path "{0}" for the request is not correct.
-pathForRequestMustStartWithAForwardSlash = The path "{0}" for the request must start with a forward slash.
-unsupportedPrimaryType = Primary type "{3}" for path "{0}" in workspace "{1}" in {2} is not valid for the file system connector. Valid primary types are nt\:file, nt\:folder, nt\:resource, and dna\:resouce.
-invalidPropertyNames = Attempt to set or update invalid property names: {0}
-invalidNameForResource = Invalid node name "{3}" for node at path "{0}" in workspace "{1}" in {2}. The name of nodes with primary type nt:resource or dna:resource must be "jcr:content".
-invalidPathForResource = Invalid parent type for node at path "{0}" in workspace "{1}" in {2}. The parent node for nodes with primary type nt:resource or dna:resource must be of type nt:file.
-missingRequiredProperty = Missing required property "{3}" at path "{0}" in workspace "{1}" in {2}
-
-
-# Writable tests
-couldNotCreateFile =Error reading data at path "{0}" in workspace "{1}" in source "{2}": "{3}"
-couldNotReadData= Error reading data in workspace "{1}" "{0}" "{2}" "{3}"
-deleteFailed=Error deleting path {0} in workspace with source name {1}
\ No newline at end of file
Copied: trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn/SvnRepositoryConnectorI18n.properties (from rev 1525, trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn/SVNRepositoryConnectorI18n.properties)
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn/SvnRepositoryConnectorI18n.properties (rev 0)
+++ trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn/SvnRepositoryConnectorI18n.properties 2010-01-05 13:24:33 UTC (rev 1526)
@@ -0,0 +1,59 @@
+#
+# JBoss DNA (http://www.jboss.org/dna)
+# See the COPYRIGHT.txt file distributed with this work for information
+# regarding copyright ownership. Some portions may be licensed
+# to Red Hat, Inc. under one or more contributor license agreements.
+# See the AUTHORS.txt file in the distribution for a full listing of
+# individual contributors.
+#
+# JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+# is licensed to you under the terms of the GNU Lesser General Public License as
+# published by the Free Software Foundation; either version 2.1 of
+# the License, or (at your option) any later version.
+#
+# JBoss DNA is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+# Lesser General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public
+# License along with this software; if not, write to the Free
+# Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+# 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+#
+
+connectorName = SVN Connector
+nodeDoesNotExist = This node kind is missing at {0}
+nodeAlreadyExist = This node {0} already exist.
+nodeIsActuallyUnknow = This node kind is actually unknown {0}
+propertyIsRequired = The {0} property is required but has no value
+errorSerializingCachePolicyInSource = Error serializing a {0} instance owned by the {1} SVNRepositorySource
+sourceIsReadOnly = {0} is a read-only source; no updates are allowed
+sourceDoesNotSupportCreatingWorkspaces = {0} is a source that does not allow creating workspaces
+sourceDoesNotSupportCloningWorkspaces = {0} is a source that does not allow cloning workspaces
+sourceDoesNotSupportDeletingWorkspaces = {0} is a source that does not allow deleting workspaces
+connectingFailureOrUserAuthenticationProblem=failure occured while connecting to the repository {0} or the user's authentication failed
+workspaceDoesNotExist = The workspace "{0}" does not exist
+pathForDefaultWorkspaceDoesNotExist = The path "{0}" for the default workspace for the file system source "{1}" does not represent an existing directory
+pathForDefaultWorkspaceIsNotDirectory = The path "{0}" for the default workspace for the file system source "{1}" is actually a path to an existing file
+pathForDefaultWorkspaceCannotBeRead = The path "{0}" for the default workspace for the file system source "{1}" cannot be read
+pathForPredefinedWorkspaceDoesNotExist = The path "{0}" for the predefined workspace for the file system source "{1}" does not represent an existing directory
+pathForPredefinedWorkspaceIsNotDirectory = The path "{0}" for the predefined workspace for the file system source "{1}" is actually a path to an existing file
+pathForPredefinedWorkspaceCannotBeRead = The path "{0}" for the predefined workspace for the file system source "{1}" cannot be read
+sameNameSiblingsAreNotAllowed = {0}
+onlyTheDefaultNamespaceIsAllowed = {0} requires node names use the default namespace: {1}
+locationInRequestMustHavePath = {0} requires a path in the request: {1}
+unableToCreateWorkspaces = {0} does not allow creating new workspaces (request was to create "{1}")
+pathForRequestIsNotCorrect = The path "{0}" for the request is not correct.
+pathForRequestMustStartWithAForwardSlash = The path "{0}" for the request must start with a forward slash.
+unsupportedPrimaryType = Primary type "{3}" for path "{0}" in workspace "{1}" in {2} is not valid for the file system connector. Valid primary types are nt\:file, nt\:folder, nt\:resource, and dna\:resouce.
+invalidPropertyNames = Attempt to set or update invalid property names: {0}
+invalidNameForResource = Invalid node name "{3}" for node at path "{0}" in workspace "{1}" in {2}. The name of nodes with primary type nt:resource or dna:resource must be "jcr:content".
+invalidPathForResource = Invalid parent type for node at path "{0}" in workspace "{1}" in {2}. The parent node for nodes with primary type nt:resource or dna:resource must be of type nt:file.
+missingRequiredProperty = Missing required property "{3}" at path "{0}" in workspace "{1}" in {2}
+
+
+# Writable tests
+couldNotCreateFile =Error reading data at path "{0}" in workspace "{1}" in source "{2}": "{3}"
+couldNotReadData= Error reading data in workspace "{1}" "{0}" "{2}" "{3}"
+deleteFailed=Error deleting path {0} in workspace with source name {1}
\ No newline at end of file
DNA SVN: r1525 - in trunk: extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector and 4 other directories.
by dna-commits@lists.jboss.org
Author: bcarothers
Date: 2010-01-05 07:50:08 -0500 (Tue, 05 Jan 2010)
New Revision: 1525
Added:
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnActionExecutor.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepository.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepositoryConnectorI18n.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepositoryLexicon.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepositorySource.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepositoryUtil.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/package-info.java
trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn/
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/
Removed:
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/
trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn2/
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/
Modified:
trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/svn/SvnAndJcrIntegrationTest.java
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnConnectorTestUtil.java
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnIntegrationTest.java
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRepositoryConnectorI18nTest.java
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRepositoryConnectorNoCreateWorkspaceTest.java
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRepositoryConnectorNotWritableTest.java
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRepositoryConnectorWritableTest.java
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRepositorySourceTest.java
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRespositoryConnectorReadableTest.java
Log:
DNA-519 SVN Connector integration test runs in a very long time
Applied patch (DNA-519_repackage_svn.patch) that moves everything back into the svn package. At this point, the SVN connector has now been fully converted to the path repository framework and can start to leverage the caching support provided therein. The next step is to write a CachePolicy and WorkspaceCache implementation for the SVN connector to improve read performance. After that, connection pooling can be considered.
There is also a need to implement all-or-nothing transaction support. That will be covered in a separate defect.
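The caching the log refers to is the read-through pattern visible in `SvnWorkspace.getNode` in the diff below: consult the per-workspace cache first, and only on a miss perform the expensive SVN round trip and store the result. A minimal, self-contained sketch of that pattern follows; the class and method names (`NodeCacheSketch`, `loadFromRepository`) are illustrative stand-ins, not DNA's actual `WorkspaceCache` API.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of a read-through node cache: get() hits the cache first and
// falls back to the (slow) repository only on a miss. Names here are
// hypothetical; DNA's real API is WorkspaceCache.get(Path)/set(PathNode).
public class NodeCacheSketch {
    private final Map<String, String> cache = new ConcurrentHashMap<>();
    private int repositoryReads = 0; // counts expensive backing reads

    // Stand-in for an SVN round trip (e.g. SVNRepository.getFile)
    private String loadFromRepository(String path) {
        repositoryReads++;
        return "node@" + path;
    }

    public String getNode(String path) {
        String node = cache.get(path);
        if (node != null) return node;   // cache hit: no SVN call made
        node = loadFromRepository(path); // cache miss: read through
        cache.put(path, node);           // remember for the next caller
        return node;
    }

    public int repositoryReads() {
        return repositoryReads;
    }

    public static void main(String[] args) {
        NodeCacheSketch ws = new NodeCacheSketch();
        ws.getNode("/trunk/readme.txt");
        ws.getNode("/trunk/readme.txt"); // served from cache this time
        System.out.println("repository reads: " + ws.repositoryReads());
    }
}
```

With a `CachePolicy` layered on top, entries would also carry an expiry so stale SVN content is eventually re-read; that is the piece the log identifies as the next step.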
Modified: trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/svn/SvnAndJcrIntegrationTest.java
===================================================================
--- trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/svn/SvnAndJcrIntegrationTest.java 2010-01-05 12:43:25 UTC (rev 1524)
+++ trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/svn/SvnAndJcrIntegrationTest.java 2010-01-05 12:50:08 UTC (rev 1525)
@@ -31,7 +31,7 @@
import javax.jcr.Property;
import javax.jcr.PropertyIterator;
import javax.jcr.Session;
-import org.jboss.dna.connector.svn2.SvnRepositorySource;
+import org.jboss.dna.connector.svn.SvnRepositorySource;
import org.jboss.dna.graph.SecurityContext;
import org.jboss.dna.jcr.JcrConfiguration;
import org.jboss.dna.jcr.JcrEngine;
Copied: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnActionExecutor.java (from rev 1524, trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnActionExecutor.java)
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnActionExecutor.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnActionExecutor.java 2010-01-05 12:50:08 UTC (rev 1525)
@@ -0,0 +1,73 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.connector.svn;
+
+import org.jboss.dna.connector.scm.ScmAction;
+import org.jboss.dna.connector.scm.ScmActionExecutor;
+import org.tmatesoft.svn.core.SVNErrorCode;
+import org.tmatesoft.svn.core.SVNErrorMessage;
+import org.tmatesoft.svn.core.SVNException;
+import org.tmatesoft.svn.core.io.ISVNEditor;
+import org.tmatesoft.svn.core.io.SVNRepository;
+
+/**
+ */
+public class SvnActionExecutor implements ScmActionExecutor {
+
+ private final SVNRepository repository;
+
+ /**
+ * @param repository
+ */
+ public SvnActionExecutor( SVNRepository repository ) {
+ this.repository = repository;
+ }
+
+ /**
+ * @return repository
+ */
+ public SVNRepository getRepository() {
+ return repository;
+ }
+
+ /**
+ * @param action
+ * @param message
+ * @throws SVNException
+ */
+ public void execute( ScmAction action,
+ String message ) throws SVNException {
+ ISVNEditor editor = this.repository.getCommitEditor(message, null);
+ editor.openRoot(-1);
+ try {
+ action.applyAction(editor);
+ } catch (Exception e) {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN, "This error is appeared: '{0}'", e.getMessage());
+ throw new SVNException(err, e);
+ }
+ editor.closeDir();
+ editor.closeEdit();
+
+ }
+}
Copied: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepository.java (from rev 1524, trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepository.java)
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepository.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepository.java 2010-01-05 12:50:08 UTC (rev 1525)
@@ -0,0 +1,889 @@
+package org.jboss.dna.connector.svn;
+
+import java.io.ByteArrayOutputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.LinkedList;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.UUID;
+import org.jboss.dna.common.i18n.I18n;
+import org.jboss.dna.connector.scm.ScmAction;
+import org.jboss.dna.connector.svn.mgnt.AddDirectory;
+import org.jboss.dna.connector.svn.mgnt.AddFile;
+import org.jboss.dna.connector.svn.mgnt.DeleteEntry;
+import org.jboss.dna.connector.svn.mgnt.UpdateFile;
+import org.jboss.dna.graph.DnaIntLexicon;
+import org.jboss.dna.graph.DnaLexicon;
+import org.jboss.dna.graph.ExecutionContext;
+import org.jboss.dna.graph.JcrLexicon;
+import org.jboss.dna.graph.JcrNtLexicon;
+import org.jboss.dna.graph.NodeConflictBehavior;
+import org.jboss.dna.graph.connector.RepositorySourceException;
+import org.jboss.dna.graph.connector.path.AbstractWritablePathWorkspace;
+import org.jboss.dna.graph.connector.path.DefaultPathNode;
+import org.jboss.dna.graph.connector.path.PathNode;
+import org.jboss.dna.graph.connector.path.WritablePathRepository;
+import org.jboss.dna.graph.connector.path.WritablePathWorkspace;
+import org.jboss.dna.graph.connector.path.cache.WorkspaceCache;
+import org.jboss.dna.graph.property.Binary;
+import org.jboss.dna.graph.property.BinaryFactory;
+import org.jboss.dna.graph.property.DateTimeFactory;
+import org.jboss.dna.graph.property.Name;
+import org.jboss.dna.graph.property.NameFactory;
+import org.jboss.dna.graph.property.NamespaceRegistry;
+import org.jboss.dna.graph.property.Path;
+import org.jboss.dna.graph.property.PathFactory;
+import org.jboss.dna.graph.property.Property;
+import org.jboss.dna.graph.property.PropertyFactory;
+import org.jboss.dna.graph.property.Path.Segment;
+import org.jboss.dna.graph.request.InvalidRequestException;
+import org.tmatesoft.svn.core.SVNDirEntry;
+import org.tmatesoft.svn.core.SVNErrorCode;
+import org.tmatesoft.svn.core.SVNErrorMessage;
+import org.tmatesoft.svn.core.SVNException;
+import org.tmatesoft.svn.core.SVNNodeKind;
+import org.tmatesoft.svn.core.SVNProperties;
+import org.tmatesoft.svn.core.SVNProperty;
+import org.tmatesoft.svn.core.SVNURL;
+import org.tmatesoft.svn.core.auth.ISVNAuthenticationManager;
+import org.tmatesoft.svn.core.internal.io.dav.DAVRepositoryFactory;
+import org.tmatesoft.svn.core.internal.io.fs.FSRepositoryFactory;
+import org.tmatesoft.svn.core.internal.io.svn.SVNRepositoryFactoryImpl;
+import org.tmatesoft.svn.core.io.SVNRepository;
+import org.tmatesoft.svn.core.io.SVNRepositoryFactory;
+import org.tmatesoft.svn.core.wc.SVNWCUtil;
+
+public class SvnRepository extends WritablePathRepository {
+
+ private static final String DEFAULT_MIME_TYPE = "application/octet-stream";
+ private static final byte[] EMPTY_BYTE_ARRAY = new byte[0];
+
+ private final SvnRepositorySource source;
+
+ static {
+ // for DAV (over http and https)
+ DAVRepositoryFactory.setup();
+ // For File
+ FSRepositoryFactory.setup();
+ // for SVN (over svn and svn+ssh)
+ SVNRepositoryFactoryImpl.setup();
+ }
+
+ public SvnRepository( SvnRepositorySource source ) {
+ super(source);
+
+ this.source = source;
+ initialize();
+ }
+
+ @Override
+ protected void initialize() {
+ ExecutionContext context = source.getRepositoryContext().getExecutionContext();
+ for (String workspaceName : source.getPredefinedWorkspaceNames()) {
+ doCreateWorkspace(context, workspaceName);
+ }
+
+ String defaultWorkspaceName = source.getDirectoryForDefaultWorkspace();
+ if (defaultWorkspaceName != null && !workspaces.containsKey(defaultWorkspaceName)) {
+ doCreateWorkspace(context, defaultWorkspaceName);
+ }
+
+ }
+
+ public WorkspaceCache getCache( String workspaceName ) {
+ return source.getPathRepositoryCache().getCache(workspaceName);
+ }
+
+ /**
+ * Internal method that creates a workspace and adds it to the map of active workspaces without checking to see if the source
+ * allows creating workspaces. This is useful when setting up predefined workspaces.
+ *
+ * @param context the current execution context; may not be null
+ * @param name the name of the workspace to create; may not be null
+ * @return the newly created workspace; never null
+ */
+ private WritablePathWorkspace doCreateWorkspace( ExecutionContext context,
+ String name ) {
+ SvnWorkspace workspace = new SvnWorkspace(name, source.getRootNodeUuid());
+
+ workspaces.putIfAbsent(name, workspace);
+ return (WritablePathWorkspace)workspaces.get(name);
+
+ }
+
+ @Override
+ protected WritablePathWorkspace createWorkspace( ExecutionContext context,
+ String name ) {
+ if (!source.isCreatingWorkspacesAllowed()) {
+ String msg = SvnRepositoryConnectorI18n.unableToCreateWorkspaces.text(getSourceName(), name);
+ throw new InvalidRequestException(msg);
+ }
+
+ return doCreateWorkspace(context, name);
+ }
+
+ class SvnWorkspace extends AbstractWritablePathWorkspace {
+
+ /**
+ * Only certain properties are tolerated when writing content (dna:resource or jcr:resource) nodes. These properties are
+ * implicitly stored (primary type, data) or silently ignored (encoded, mimetype, last modified). The silently ignored
+ * properties must be accepted to stay compatible with the JCR specification.
+ */
+ private final Set<Name> ALLOWABLE_PROPERTIES_FOR_CONTENT = Collections.unmodifiableSet(new HashSet<Name>(
+ Arrays.asList(new Name[] {
+ JcrLexicon.PRIMARY_TYPE,
+ JcrLexicon.DATA,
+ JcrLexicon.ENCODED,
+ JcrLexicon.MIMETYPE,
+ JcrLexicon.LAST_MODIFIED,
+ JcrLexicon.UUID,
+ DnaIntLexicon.NODE_DEFINITON})));
+ /**
+ * Only certain properties are tolerated when writing files (nt:file) or folders (nt:folder) nodes. These properties are
+ * implicitly stored in the file or folder (primary type, created).
+ */
+ private final Set<Name> ALLOWABLE_PROPERTIES_FOR_FILE_OR_FOLDER = Collections.unmodifiableSet(new HashSet<Name>(
+ Arrays.asList(new Name[] {
+ JcrLexicon.PRIMARY_TYPE,
+ JcrLexicon.CREATED,
+ JcrLexicon.UUID,
+ DnaIntLexicon.NODE_DEFINITON})));
+
+ private final SVNRepository workspaceRoot;
+
+ public SvnWorkspace( String name,
+ UUID rootNodeUuid ) {
+ super(name, rootNodeUuid);
+
+ try {
+ workspaceRoot = SVNRepositoryFactory.create(SVNURL.parseURIDecoded(name));
+
+ ISVNAuthenticationManager authManager = SVNWCUtil.createDefaultAuthenticationManager(source.getUsername(),
+ source.getPassword());
+ workspaceRoot.setAuthenticationManager(authManager);
+ } catch (SVNException ex) {
+ throw new IllegalStateException(ex);
+ }
+ }
+
+ public Path getLowestExistingPath( Path path ) {
+ do {
+ path = path.getParent();
+
+ if (getNode(path) != null) {
+ return path;
+ }
+ } while (path != null);
+
+ assert false : "workspace root path was not a valid path";
+ return null;
+ }
+
+ public PathNode getNode( Path path ) {
+ WorkspaceCache cache = getCache(getName());
+
+ PathNode node = cache.get(path);
+ if (node != null) return node;
+
+ ExecutionContext context = source.getRepositoryContext().getExecutionContext();
+ List<Property> properties = new LinkedList<Property>();
+ List<Segment> children = new LinkedList<Segment>();
+
+ try {
+ boolean result = readNode(context, this.getName(), path, properties, children);
+ if (!result) return null;
+ } catch (SVNException ex) {
+ return null;
+ }
+
+ UUID uuid = path.isRoot() ? source.getRootNodeUuid() : null;
+ node = new DefaultPathNode(path, uuid, properties, children);
+
+ cache.set(node);
+ return node;
+ }
+
+ public PathNode createNode( ExecutionContext context,
+ PathNode parentNode,
+ Name name,
+ Map<Name, Property> properties,
+ NodeConflictBehavior conflictBehavior ) {
+
+ NamespaceRegistry registry = context.getNamespaceRegistry();
+ NameFactory nameFactory = context.getValueFactories().getNameFactory();
+ PathFactory pathFactory = context.getValueFactories().getPathFactory();
+
+ // New name to commit into the svn repos workspace
+ String newName = name.getString(registry);
+
+ Property primaryTypeProp = properties.get(JcrLexicon.PRIMARY_TYPE);
+ Name primaryType = primaryTypeProp == null ? null : nameFactory.create(primaryTypeProp.getFirstValue());
+
+ Path parentPath = parentNode.getPath();
+ String parentPathAsString = parentPath.getString(registry);
+ Path newPath = pathFactory.create(parentPath, name);
+
+ String newChildPath = null;
+
+ // File
+ if (JcrNtLexicon.FILE.equals(primaryType)) {
+ ensureValidProperties(context, properties.values(), ALLOWABLE_PROPERTIES_FOR_FILE_OR_FOLDER);
+ // Parent node already exist
+ boolean skipWrite = false;
+
+ if (parentPath.isRoot()) {
+ if (!source.getRepositoryRootUrl().equals(getName())) {
+ newChildPath = newName;
+ } else {
+ newChildPath = "/" + newName;
+ }
+ } else {
+ newChildPath = newPath.getString(registry);
+ if (!source.getRepositoryRootUrl().equals(getName())) {
+ newChildPath = newChildPath.substring(1);
+ }
+ }
+
+ // check if the new name already exist
+ try {
+ if (SvnRepositoryUtil.exists(workspaceRoot, newChildPath)) {
+ if (conflictBehavior.equals(NodeConflictBehavior.APPEND)) {
+ I18n msg = SvnRepositoryConnectorI18n.sameNameSiblingsAreNotAllowed;
+ throw new InvalidRequestException(msg.text("SVN Connector does not support Same Name Sibling"));
+ } else if (conflictBehavior.equals(NodeConflictBehavior.DO_NOT_REPLACE)) {
+ skipWrite = true;
+ }
+ }
+ } catch (SVNException e1) {
+ throw new RepositorySourceException(getSourceName(), e1.getMessage());
+ }
+
+ // Don't try to write if the node conflict behavior is DO_NOT_REPLACE
+ if (!skipWrite) {
+ // create a new, empty file
+ if (newChildPath != null) {
+ try {
+ String rootPath = null;
+ if (parentPath.isRoot()) {
+ rootPath = "";
+ } else {
+ rootPath = parentPathAsString;
+ }
+ newFile(rootPath, newName, EMPTY_BYTE_ARRAY, null, getName(), workspaceRoot);
+ } catch (SVNException e) {
+ I18n msg = SvnRepositoryConnectorI18n.couldNotCreateFile;
+ throw new RepositorySourceException(getSourceName(), msg.text(parentPathAsString,
+ getName(),
+ getSourceName(),
+ e.getMessage()), e);
+ }
+ }
+ }
+ } else if (JcrNtLexicon.RESOURCE.equals(primaryType) || DnaLexicon.RESOURCE.equals(primaryType)) { // Resource
+ ensureValidProperties(context, properties.values(), ALLOWABLE_PROPERTIES_FOR_CONTENT);
+ if (parentPath.isRoot()) {
+ newChildPath = parentPathAsString;
+ if (!source.getRepositoryRootUrl().equals(getName())) {
+ newChildPath = parentPathAsString.substring(1);
+ }
+ } else {
+ newChildPath = parentPathAsString;
+ if (!source.getRepositoryRootUrl().equals(getName())) {
+ newChildPath = newChildPath.substring(1);
+ }
+ }
+
+ if (!JcrLexicon.CONTENT.equals(name)) {
+ I18n msg = SvnRepositoryConnectorI18n.invalidNameForResource;
+ throw new RepositorySourceException(getSourceName(), msg.text(parentPathAsString,
+ getName(),
+ getSourceName(),
+ newName));
+ }
+
+ Property parentPrimaryType = parentNode.getProperty(JcrLexicon.PRIMARY_TYPE);
+ Name parentPrimaryTypeName = parentPrimaryType == null ? null : nameFactory.create(parentPrimaryType.getFirstValue());
+ if (!JcrNtLexicon.FILE.equals(parentPrimaryTypeName)) {
+ I18n msg = SvnRepositoryConnectorI18n.invalidPathForResource;
+ throw new RepositorySourceException(getSourceName(), msg.text(parentPathAsString, getName(), getSourceName()));
+ }
+
+ boolean skipWrite = false;
+ if (conflictBehavior.equals(NodeConflictBehavior.APPEND)) {
+ I18n msg = SvnRepositoryConnectorI18n.sameNameSiblingsAreNotAllowed;
+ throw new InvalidRequestException(msg.text("SVN Connector does not support Same Name Sibling"));
+ } else if (conflictBehavior.equals(NodeConflictBehavior.DO_NOT_REPLACE)) {
+ // TODO check if the file already has content
+ skipWrite = true;
+ }
+
+ if (!skipWrite) {
+ Property dataProperty = properties.get(JcrLexicon.DATA);
+ if (dataProperty == null) {
+ I18n msg = SvnRepositoryConnectorI18n.missingRequiredProperty;
+ String dataPropName = JcrLexicon.DATA.getString(registry);
+ throw new RepositorySourceException(getSourceName(), msg.text(parentPathAsString,
+ getName(),
+ getSourceName(),
+ dataPropName));
+ }
+
+ BinaryFactory binaryFactory = context.getValueFactories().getBinaryFactory();
+ Binary binary = binaryFactory.create(properties.get(JcrLexicon.DATA).getFirstValue());
+ // get old data
+ ByteArrayOutputStream contents = new ByteArrayOutputStream();
+ SVNProperties svnProperties = new SVNProperties();
+ try {
+ workspaceRoot.getFile(newChildPath, -1, svnProperties, contents);
+ byte[] oldData = contents.toByteArray();
+
+ // modify the empty old data with the new resource
+ if (oldData != null) {
+ String pathToFile;
+ if (parentPath.isRoot()) {
+ pathToFile = "";
+ } else {
+ pathToFile = parentPath.getParent().getString(registry);
+ }
+ String fileName = parentPath.getLastSegment().getString(registry);
+
+ modifyFile(pathToFile, fileName, oldData, binary.getBytes(), null, getName(), workspaceRoot);
+ }
+ } catch (SVNException e) {
+ I18n msg = SvnRepositoryConnectorI18n.couldNotReadData;
+ throw new RepositorySourceException(getSourceName(), msg.text(parentPathAsString,
+ getName(),
+ getSourceName(),
+ e.getMessage()), e);
+ }
+ }
+
+ } else if (JcrNtLexicon.FOLDER.equals(primaryType) || primaryType == null) { // Folder
+ ensureValidProperties(context, properties.values(), ALLOWABLE_PROPERTIES_FOR_FILE_OR_FOLDER);
+ try {
+ mkdir(parentPathAsString, newName, null, getName(), workspaceRoot);
+ } catch (SVNException e) {
+ I18n msg = SvnRepositoryConnectorI18n.couldNotCreateFile;
+ throw new RepositorySourceException(getSourceName(), msg.text(parentPathAsString,
+ getName(),
+ getSourceName(),
+ e.getMessage()), e);
+ }
+ } else {
+ I18n msg = SvnRepositoryConnectorI18n.unsupportedPrimaryType;
+ throw new RepositorySourceException(getSourceName(), msg.text(primaryType.getString(registry),
+ parentPathAsString,
+ getName(),
+ getSourceName()));
+ }
+
+ PathNode node = getNode(newPath);
+
+ List<Segment> newChildren = new ArrayList<Segment>(parentNode.getChildSegments().size() + 1);
+ newChildren.addAll(parentNode.getChildSegments());
+ newChildren.add(node.getPath().getLastSegment());
+
+ WorkspaceCache cache = getCache(getName());
+ cache.set(new DefaultPathNode(parentNode.getPath(), parentNode.getUuid(), parentNode.getProperties(), newChildren));
+ cache.set(node);
+
+ return node;
+ }
+
+ /**
+ * Create a directory in the repository.
+ *
+ * @param rootDirPath - the path to the parent directory in which the new directory will be created
+ * @param childDirPath - the name of the directory to create
+ * @param comment - the commit comment, or null for a default comment
+ * @param inWorkspace - the name of the workspace in which the directory is created
+ * @param currentRepository - the SVN repository in which the directory is created
+ * @throws SVNException - if an error occurs during the creation
+ */
+ private void mkdir( String rootDirPath,
+ String childDirPath,
+ String comment,
+ String inWorkspace,
+ SVNRepository currentRepository ) throws SVNException {
+
+ String tempParentPath = rootDirPath;
+ if (!source.getRepositoryRootUrl().equals(inWorkspace)) {
+ if (!tempParentPath.equals("/") && tempParentPath.startsWith("/")) {
+ tempParentPath = tempParentPath.substring(1);
+ } else if (tempParentPath.equals("/")) {
+ tempParentPath = "";
+ }
+ }
+ String checkPath = tempParentPath.length() == 0 ? childDirPath : tempParentPath + "/" + childDirPath;
+ SVNNodeKind nodeKind = null;
+ try {
+ nodeKind = currentRepository.checkPath(checkPath, -1);
+ } catch (SVNException e) {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
+ "There may be a problem connecting to the repository, or the user's authentication failed: {0}",
+ e.getMessage());
+ throw new SVNException(err);
+ }
+
+ if (nodeKind != null && nodeKind == SVNNodeKind.NONE) {
+ ScmAction addNodeAction = new AddDirectory(rootDirPath, childDirPath);
+ SvnActionExecutor executor = new SvnActionExecutor(currentRepository);
+ comment = comment == null ? "Create a new directory " + childDirPath : comment;
+ executor.execute(addNodeAction, comment);
+ } else {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
+ "Node with name '{0}' can't be created",
+ childDirPath);
+ throw new SVNException(err);
+ }
+ }
+
+ /**
+ * Create a file.
+ *
+ * @param rootDirPath - the path to the directory in which the file will be created
+ * @param childFilePath - the name of the file to create
+ * @param content - the initial content of the file
+ * @param comment - the commit comment, or null for a default comment
+ * @param inWorkspace - the name of the workspace in which the file is created
+ * @param currentRepository - the SVN repository in which the file is created
+ * @throws SVNException - if the file already exists or an error occurs during the creation
+ */
+ private void newFile( String rootDirPath,
+ String childFilePath,
+ byte[] content,
+ String comment,
+ String inWorkspace,
+ SVNRepository currentRepository ) throws SVNException {
+
+ String tempParentPath = rootDirPath;
+ if (!source.getRepositoryRootUrl().equals(inWorkspace)) {
+ if (!tempParentPath.equals("/") && tempParentPath.startsWith("/")) {
+ tempParentPath = tempParentPath.substring(1);
+ }
+ }
+ String checkPath = tempParentPath + "/" + childFilePath;
+ SVNNodeKind nodeKind = null;
+ try {
+ nodeKind = currentRepository.checkPath(checkPath, -1);
+ } catch (SVNException e) {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
+ "There may be a problem connecting to the repository, or the user's authentication failed: {0}",
+ e.getMessage());
+ throw new SVNException(err);
+ }
+
+ if (nodeKind != null && nodeKind == SVNNodeKind.NONE) {
+ ScmAction addFileNodeAction = new AddFile(rootDirPath, childFilePath, content);
+ SvnActionExecutor executor = new SvnActionExecutor(currentRepository);
+ comment = comment == null ? "Create a new file " + childFilePath : comment;
+ executor.execute(addFileNodeAction, comment);
+ } else {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
+ "Item with name '{0}' can't be created (it already exists)",
+ childFilePath);
+ throw new SVNException(err);
+ }
+ }
+
+ /**
+ * Modify the content of a file.
+ *
+ * @param rootPath - the path to the directory containing the file
+ * @param fileName - the name of the file to modify
+ * @param oldData - the existing content of the file
+ * @param newData - the new content of the file
+ * @param comment - the commit comment, or null for a default comment
+ * @param inWorkspace - the name of the workspace containing the file
+ * @param currentRepository - the SVN repository containing the file
+ * @throws SVNException - if the file cannot be found or modified
+ */
+ private void modifyFile( String rootPath,
+ String fileName,
+ byte[] oldData,
+ byte[] newData,
+ String comment,
+ String inWorkspace,
+ SVNRepository currentRepository ) throws SVNException {
+ assert rootPath != null;
+ assert fileName != null;
+ assert oldData != null;
+ assert inWorkspace != null;
+ assert currentRepository != null;
+
+ try {
+
+ if (!source.getRepositoryRootUrl().equals(inWorkspace)) {
+ if (rootPath.equals("/")) {
+ rootPath = "";
+ } else {
+ rootPath = rootPath.substring(1) + "/";
+ }
+ } else {
+ if (!rootPath.equals("/")) {
+ rootPath = rootPath + "/";
+ }
+ }
+ String path = rootPath + fileName;
+
+ SVNNodeKind nodeKind = currentRepository.checkPath(path, -1);
+ if (nodeKind == SVNNodeKind.NONE || nodeKind == SVNNodeKind.UNKNOWN) {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.ENTRY_NOT_FOUND,
+ "Item with name '{0}' can't be found",
+ path);
+ throw new SVNException(err);
+ }
+
+ ScmAction modifyFileAction = new UpdateFile(rootPath, fileName, oldData, newData);
+ SvnActionExecutor executor = new SvnActionExecutor(currentRepository);
+ comment = comment == null ? "Modify the file " + fileName : comment;
+ executor.execute(modifyFileAction, comment);
+
+ } catch (SVNException e) {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN, "An error occurred: " + e.getMessage());
+ throw new SVNException(err, e);
+ }
+
+ }
+
+ /**
+ * Delete an entry from the repository.
+ *
+ * @param path - the path of the entry to delete
+ * @param comment - the commit comment, or null for a default comment
+ * @param inWorkspace - the name of the workspace containing the entry
+ * @param currentRepository - the SVN repository containing the entry
+ * @throws SVNException - if the entry is the root directory or cannot be deleted
+ */
+ private void eraseEntry( String path,
+ String comment,
+ String inWorkspace,
+ SVNRepository currentRepository ) throws SVNException {
+ assert path != null;
+ assert inWorkspace != null;
+ if (path.equals("/") || path.equals("")) {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.BAD_URL, "The root directory cannot be deleted");
+ throw new SVNException(err);
+ }
+
+ try {
+ ScmAction deleteEntryAction = new DeleteEntry(path);
+ SvnActionExecutor executor = new SvnActionExecutor(currentRepository);
+ comment = comment == null ? "Delete the " + path : comment;
+ executor.execute(deleteEntryAction, comment);
+ } catch (SVNException e) {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
+ "Unknown error during delete action: {0}",
+ e.getMessage());
+ throw new SVNException(err);
+ }
+ }
+
+ public boolean removeNode( ExecutionContext context,
+ Path nodePath ) {
+
+ NamespaceRegistry registry = context.getNamespaceRegistry();
+
+ boolean isContentNode = !nodePath.isRoot() && JcrLexicon.CONTENT.equals(nodePath.getLastSegment().getName());
+ Path actualPath = isContentNode ? nodePath.getParent() : nodePath;
+
+ try {
+ SVNNodeKind kind = getNodeKind(context, actualPath, source.getRepositoryRootUrl());
+
+ if (kind == SVNNodeKind.NONE) {
+ return false;
+ }
+
+ if (isContentNode) {
+ String rootPath = actualPath.getParent().getString(registry);
+ String fileName = actualPath.getLastSegment().getString(registry);
+ modifyFile(rootPath, fileName, EMPTY_BYTE_ARRAY, EMPTY_BYTE_ARRAY, null, getName(), workspaceRoot);
+ } else {
+ eraseEntry(actualPath.getString(registry), null, getName(), workspaceRoot);
+ }
+ } catch (SVNException e) {
+ throw new RepositorySourceException(getSourceName(),
+ SvnRepositoryConnectorI18n.deleteFailed.text(nodePath, getSourceName()));
+ }
+
+ getCache(getName()).invalidate(nodePath);
+
+ return true;
+ }
+
+ public PathNode setProperties( ExecutionContext context,
+ Path nodePath,
+ Map<Name, Property> properties ) {
+ PathNode targetNode = getNode(nodePath);
+ if (targetNode == null) return null;
+
+ /*
+ * You can't really remove any properties from SVN nodes.
+ * You can clear the data of a dna:resource though
+ */
+
+ NameFactory nameFactory = context.getValueFactories().getNameFactory();
+ Property primaryTypeProperty = targetNode.getProperty(JcrLexicon.PRIMARY_TYPE);
+ Name primaryTypeName = primaryTypeProperty == null ? null : nameFactory.create(primaryTypeProperty.getFirstValue());
+ if (DnaLexicon.RESOURCE.equals(primaryTypeName)) {
+
+ for (Map.Entry<Name, Property> entry : properties.entrySet()) {
+ if (JcrLexicon.DATA.equals(entry.getKey())) {
+ NamespaceRegistry registry = context.getNamespaceRegistry();
+ byte[] data;
+ if (entry.getValue() == null) {
+ data = EMPTY_BYTE_ARRAY;
+ } else {
+ BinaryFactory binaryFactory = context.getValueFactories().getBinaryFactory();
+ data = binaryFactory.create(entry.getValue().getFirstValue()).getBytes();
+
+ }
+
+ try {
+ Path actualPath = nodePath.getParent();
+ modifyFile(actualPath.getParent().getString(registry),
+ actualPath.getLastSegment().getString(registry),
+ EMPTY_BYTE_ARRAY,
+ data,
+ "",
+ getName(),
+ workspaceRoot);
+
+ PathNode node = getNode(nodePath);
+ getCache(getName()).set(node);
+
+ return node;
+ } catch (SVNException ex) {
+ throw new RepositorySourceException(getSourceName(),
+ SvnRepositoryConnectorI18n.deleteFailed.text(nodePath,
+ getSourceName()), ex);
+ }
+ }
+ }
+ }
+
+ return targetNode;
+ }
+
+ protected boolean readNode( ExecutionContext context,
+ String workspaceName,
+ Path requestedPath,
+ List<Property> properties,
+ List<Segment> children ) throws SVNException {
+ PathFactory pathFactory = context.getValueFactories().getPathFactory();
+ NamespaceRegistry registry = context.getNamespaceRegistry();
+
+ if (requestedPath.isRoot()) {
+ // workspace root must be a directory
+ if (children != null) {
+ final Collection<SVNDirEntry> entries = SvnRepositoryUtil.getDir(workspaceRoot, "");
+ for (SVNDirEntry entry : entries) {
+ // All of the children of a directory will be another directory or a file, but never a "jcr:content" node
+ // ...
+ children.add(pathFactory.createSegment(entry.getName()));
+ }
+ }
+ // There are no properties on the root ...
+ } else {
+ // Generate the properties for this File object ...
+ PropertyFactory factory = context.getPropertyFactory();
+ DateTimeFactory dateFactory = context.getValueFactories().getDateFactory();
+
+ // Figure out the kind of node this represents ...
+ SVNNodeKind kind = getNodeKind(context, requestedPath, source.getRepositoryRootUrl());
+ if (kind == SVNNodeKind.NONE) {
+ // The node doesn't exist
+ return false;
+ }
+ if (kind == SVNNodeKind.DIR) {
+ String directoryPath = requestedPath.getString(registry);
+ if (!source.getRepositoryRootUrl().equals(workspaceName)) {
+ directoryPath = directoryPath.substring(1);
+ }
+ if (children != null) {
+ // Decide how to represent the children ...
+ Collection<SVNDirEntry> dirEntries = SvnRepositoryUtil.getDir(workspaceRoot, directoryPath);
+ for (SVNDirEntry entry : dirEntries) {
+ // All of the children of a directory will be another directory or a file,
+ // but never a "jcr:content" node ...
+ children.add(pathFactory.createSegment(entry.getName()));
+ }
+ }
+ if (properties != null) {
+ // Load the properties for this directory ...
+ properties.add(factory.create(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FOLDER));
+ SVNDirEntry entry = getEntryInfo(workspaceRoot, directoryPath);
+ if (entry != null) {
+ properties.add(factory.create(JcrLexicon.CREATED, dateFactory.create(entry.getDate())));
+ }
+ }
+ } else {
+ // It's not a directory, so must be a file; the only child of an nt:file is the "jcr:content" node
+ // ...
+ if (requestedPath.endsWith(JcrLexicon.CONTENT)) {
+ // There are never any children of these nodes, just properties ...
+ if (properties != null) {
+ String contentPath = requestedPath.getParent().getString(registry);
+ if (!source.getRepositoryRootUrl().equals(workspaceName)) {
+ contentPath = contentPath.substring(1);
+ }
+ SVNDirEntry entry = getEntryInfo(workspaceRoot, contentPath);
+ if (entry != null) {
+ // The request is to get properties of the "jcr:content" child node ...
+ // Do NOT use "nt:resource", since it extends "mix:referenceable". The JCR spec
+ // does not require that "jcr:content" is of type "nt:resource", but rather just
+ // suggests it. Therefore, we can use "dna:resource", which is identical to
+ // "nt:resource" except it does not extend "mix:referenceable"
+ properties.add(factory.create(JcrLexicon.PRIMARY_TYPE, DnaLexicon.RESOURCE));
+ properties.add(factory.create(JcrLexicon.LAST_MODIFIED, dateFactory.create(entry.getDate())));
+ }
+
+ ByteArrayOutputStream os = new ByteArrayOutputStream();
+ SVNProperties fileProperties = new SVNProperties();
+ getData(contentPath, fileProperties, os);
+ String mimeType = fileProperties.getStringValue(SVNProperty.MIME_TYPE);
+ if (mimeType == null) mimeType = DEFAULT_MIME_TYPE;
+ properties.add(factory.create(JcrLexicon.MIMETYPE, mimeType));
+
+ if (os.toByteArray().length > 0) {
+ // Now put the file's content into the "jcr:data" property ...
+ BinaryFactory binaryFactory = context.getValueFactories().getBinaryFactory();
+ properties.add(factory.create(JcrLexicon.DATA, binaryFactory.create(os.toByteArray())));
+ }
+ }
+ } else {
+ // Determine the corresponding file path for this object ...
+ String filePath = requestedPath.getString(registry);
+ if (!source.getRepositoryRootUrl().equals(workspaceName)) {
+ filePath = filePath.substring(1);
+ }
+ if (children != null) {
+ // Not a "jcr:content" child node but rather an nt:file node, so add the child ...
+ children.add(pathFactory.createSegment(JcrLexicon.CONTENT));
+ }
+ if (properties != null) {
+ // Now add the properties to "nt:file" ...
+ properties.add(factory.create(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FILE));
+ ByteArrayOutputStream os = new ByteArrayOutputStream();
+ SVNProperties fileProperties = new SVNProperties();
+ getData(filePath, fileProperties, os);
+ String created = fileProperties.getStringValue(SVNProperty.COMMITTED_DATE);
+ properties.add(factory.create(JcrLexicon.CREATED, dateFactory.create(created)));
+ }
+ }
+ }
+ }
+ return true;
+ }
+
+ /**
+ * Get the entry information for a path.
+ *
+ * @param repos - the repository to query
+ * @param path - the path
+ * @return - the {@link SVNDirEntry}, or null if there is no such entry
+ */
+ protected SVNDirEntry getEntryInfo( SVNRepository repos,
+ String path ) {
+ assert path != null;
+ SVNDirEntry entry = null;
+ try {
+ entry = repos.info(path, -1);
+ } catch (SVNException e) {
+ throw new RepositorySourceException(
+ getSourceName(),
+ SvnRepositoryConnectorI18n.connectingFailureOrUserAuthenticationProblem.text(getSourceName()));
+ }
+ return entry;
+ }
+
+ /**
+ * Get the content of a file.
+ *
+ * @param path - the path to that file.
+ * @param properties - the properties of the file.
+ * @param os - the output stream where to store the content.
+ * @throws SVNException - if the path does not exist at that revision or if there is a connection problem.
+ */
+ protected void getData( String path,
+ SVNProperties properties,
+ OutputStream os ) throws SVNException {
+ workspaceRoot.getFile(path, -1, properties, os);
+
+ }
+
+ protected SVNNodeKind getNodeKind( ExecutionContext context,
+ Path path,
+ String repositoryRootUrl ) throws SVNException {
+ assert path != null;
+ assert repositoryRootUrl != null;
+
+ // See if the path is a "jcr:content" node ...
+ if (path.endsWith(JcrLexicon.CONTENT)) {
+ // We only want to use the parent path to find the actual file ...
+ path = path.getParent();
+ }
+ String pathAsString = path.getString(context.getNamespaceRegistry());
+ if (!repositoryRootUrl.equals(getName())) {
+ pathAsString = pathAsString.substring(1);
+ }
+
+ String absolutePath = pathAsString;
+ SVNNodeKind kind = workspaceRoot.checkPath(absolutePath, -1);
+ if (kind == SVNNodeKind.UNKNOWN) {
+ // node is unknown
+ throw new RepositorySourceException(getSourceName(),
+ SvnRepositoryConnectorI18n.nodeIsActuallyUnknow.text(pathAsString));
+ }
+ return kind;
+ }
+
+ protected SVNRepository getWorkspaceDirectory( String workspaceName ) {
+ if (workspaceName == null) workspaceName = source.getDirectoryForDefaultWorkspace();
+ SVNRepository repos = SvnRepositoryUtil.createRepository(workspaceName, source.getUsername(), source.getPassword());
+ if (!SvnRepositoryUtil.isDirectory(repos, "")) return null;
+ return repos;
+ }
+
+ /**
+ * Checks that the collection of {@code properties} only contains properties with allowable names.
+ *
+ * @param context - the execution context; may not be null
+ * @param properties - the properties to check
+ * @param validPropertyNames - the names of the properties that are allowed
+ * @throws RepositorySourceException if {@code properties} contains a property whose name is not in {@code validPropertyNames}
+ * @see #ALLOWABLE_PROPERTIES_FOR_CONTENT
+ * @see #ALLOWABLE_PROPERTIES_FOR_FILE_OR_FOLDER
+ */
+ protected void ensureValidProperties( ExecutionContext context,
+ Collection<Property> properties,
+ Set<Name> validPropertyNames ) {
+ List<String> invalidNames = new LinkedList<String>();
+ NamespaceRegistry registry = context.getNamespaceRegistry();
+
+ for (Property property : properties) {
+ if (!validPropertyNames.contains(property.getName())) {
+ invalidNames.add(property.getName().getString(registry));
+ }
+ }
+
+ if (!invalidNames.isEmpty()) {
+ throw new RepositorySourceException(getSourceName(),
+ SvnRepositoryConnectorI18n.invalidPropertyNames.text(invalidNames.toString()));
+ }
+ }
+
+ }
+
+}
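The mkdir, newFile, and modifyFile methods in the connector above each repeat the same leading-slash normalization before calling checkPath: when the workspace is not the repository root, the path passed to SVNKit must be relative. A minimal standalone sketch of that logic (the class and method names here are mine, not part of the connector):

```java
// Hypothetical extraction of the path normalization repeated in
// mkdir/newFile/modifyFile. When the workspace is not the repository
// root, a leading "/" is stripped so the path is relative; the root
// path "/" maps to the empty relative path.
public class SvnPathUtil {

    static String normalize(String parentPath, boolean workspaceIsRepositoryRoot) {
        if (workspaceIsRepositoryRoot) return parentPath;
        if (parentPath.equals("/")) return "";
        if (parentPath.startsWith("/")) return parentPath.substring(1);
        return parentPath;
    }

    // Build the path handed to SVNRepository.checkPath(...)
    static String checkPath(String parentPath, String childName, boolean workspaceIsRepositoryRoot) {
        String parent = normalize(parentPath, workspaceIsRepositoryRoot);
        return parent.isEmpty() ? childName : parent + "/" + childName;
    }

    public static void main(String[] args) {
        System.out.println(checkPath("/", "docs", false));     // docs
        System.out.println(checkPath("/a/b", "c.txt", false)); // a/b/c.txt
        System.out.println(checkPath("/a/b", "c.txt", true));  // /a/b/c.txt
    }
}
```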
Copied: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepositoryConnectorI18n.java (from rev 1524, trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorI18n.java)
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepositoryConnectorI18n.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepositoryConnectorI18n.java 2010-01-05 12:50:08 UTC (rev 1525)
@@ -0,0 +1,89 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.connector.svn;
+
+import java.util.Locale;
+import java.util.Set;
+import org.jboss.dna.common.i18n.I18n;
+
+/**
+ * The internationalized string constants for the <code>org.jboss.dna.connector.svn*</code> packages.
+ */
+public final class SvnRepositoryConnectorI18n {
+
+ public static I18n connectorName;
+ public static I18n nodeDoesNotExist;
+ public static I18n nodeIsActuallyUnknow;
+ public static I18n propertyIsRequired;
+ public static I18n errorSerializingCachePolicyInSource;
+ public static I18n locationInRequestMustHavePath;
+ public static I18n sourceIsReadOnly;
+ public static I18n sourceDoesNotSupportCreatingWorkspaces;
+ public static I18n sourceDoesNotSupportCloningWorkspaces;
+ public static I18n sourceDoesNotSupportDeletingWorkspaces;
+ public static I18n connectingFailureOrUserAuthenticationProblem;
+ public static I18n pathForPredefinedWorkspaceDoesNotExist;
+ public static I18n pathForPredefinedWorkspaceIsNotDirectory;
+ public static I18n pathForPredefinedWorkspaceCannotBeRead;
+ public static I18n workspaceDoesNotExist;
+ public static I18n pathForDefaultWorkspaceDoesNotExist;
+ public static I18n pathForDefaultWorkspaceIsNotDirectory;
+ public static I18n pathForDefaultWorkspaceCannotBeRead;
+ public static I18n sameNameSiblingsAreNotAllowed;
+ public static I18n onlyTheDefaultNamespaceIsAllowed;
+ public static I18n unableToCreateWorkspaces;
+ public static I18n pathForRequestIsNotCorrect;
+ public static I18n pathForRequestMustStartWithAForwardSlash;
+ public static I18n nodeAlreadyExist;
+ public static I18n unsupportedPrimaryType;
+ public static I18n invalidPropertyNames;
+ public static I18n invalidNameForResource;
+ public static I18n invalidPathForResource;
+ public static I18n missingRequiredProperty;
+ public static I18n couldNotCreateFile;
+ public static I18n couldNotReadData;
+ public static I18n deleteFailed;
+
+ static {
+ try {
+ I18n.initialize(SvnRepositoryConnectorI18n.class);
+ } catch (final Exception err) {
+ System.err.println(err);
+ }
+ }
+
+ public static Set<Locale> getLocalizationProblemLocales() {
+ return I18n.getLocalizationProblemLocales(SvnRepositoryConnectorI18n.class);
+ }
+
+ public static Set<String> getLocalizationProblems() {
+ return I18n.getLocalizationProblems(SvnRepositoryConnectorI18n.class);
+ }
+
+ public static Set<String> getLocalizationProblems( Locale locale ) {
+ return I18n.getLocalizationProblems(SvnRepositoryConnectorI18n.class, locale);
+ }
+
+
+}
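The I18n fields in the class above are populated reflectively from a resource bundle, and the message strings use `{0}`-style positional placeholders (as seen in calls like `msg.text(parentPathAsString, ...)`). The sketch below illustrates that placeholder convention using `java.text.MessageFormat` as a stand-in for DNA's own `I18n.text(...)`; the template string is illustrative, not an actual bundle entry:

```java
import java.text.MessageFormat;

// Illustrates the {0}-style placeholder convention used by the I18n
// messages. Note MessageFormat's quoting rule: a literal apostrophe
// must be written as '' or the following {0} is not substituted.
public class PlaceholderDemo {
    public static void main(String[] args) {
        String template = "Item with name ''{0}'' can''t be created";
        String msg = MessageFormat.format(template, "readme.txt");
        System.out.println(msg); // Item with name 'readme.txt' can't be created
    }
}
```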
Copied: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepositoryLexicon.java (from rev 1524, trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryLexicon.java)
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepositoryLexicon.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepositoryLexicon.java 2010-01-05 12:50:08 UTC (rev 1525)
@@ -0,0 +1,43 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.connector.svn;
+
+import org.jboss.dna.connector.svn.SvnRepositorySource;
+import org.jboss.dna.graph.property.Name;
+import org.jboss.dna.graph.property.basic.BasicName;
+
+/**
+ * The namespace and property names used within a {@link SvnRepositorySource} to store internal information.
+ */
+public class SvnRepositoryLexicon {
+
+ public static class Namespace {
+ public static final String URI = "http://www.jboss.org/dna/connector/svn";
+ public static final String PREFIX = "dnasvn";
+ }
+
+ public static final Name CHILD_PATH_SEGMENT_LIST = new BasicName(Namespace.URI, "orderedChildNames");
+ public static final Name UUID = new BasicName(Namespace.URI, "uuid");
+
+}
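The lexicon class above defines `Name` constants as (namespace URI, local name) pairs via `BasicName`. A minimal stand-in showing the value-equality contract such names rely on (this `SimpleName` class is illustrative only, not DNA's `BasicName`):

```java
import java.util.Objects;

// Sketch of the namespaced-name pattern used by SvnRepositoryLexicon:
// a name is a (namespace URI, local name) pair compared by value, so
// lookups like JcrLexicon.DATA.equals(entry.getKey()) work as expected.
public final class SimpleName {
    private final String namespaceUri;
    private final String localName;

    public SimpleName(String namespaceUri, String localName) {
        this.namespaceUri = namespaceUri;
        this.localName = localName;
    }

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof SimpleName)) return false;
        SimpleName that = (SimpleName)o;
        return namespaceUri.equals(that.namespaceUri) && localName.equals(that.localName);
    }

    @Override
    public int hashCode() {
        return Objects.hash(namespaceUri, localName);
    }

    @Override
    public String toString() {
        return "{" + namespaceUri + "}" + localName; // expanded-name form
    }
}
```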
Copied: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepositorySource.java (from rev 1524, trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositorySource.java)
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepositorySource.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepositorySource.java 2010-01-05 12:50:08 UTC (rev 1525)
@@ -0,0 +1,408 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.connector.svn;
+
+import java.util.Hashtable;
+import java.util.List;
+import java.util.Map;
+import javax.naming.Context;
+import javax.naming.Name;
+import javax.naming.Reference;
+import javax.naming.StringRefAddr;
+import javax.naming.spi.ObjectFactory;
+import net.jcip.annotations.ThreadSafe;
+import org.jboss.dna.common.i18n.I18n;
+import org.jboss.dna.common.util.CheckArg;
+import org.jboss.dna.common.util.StringUtil;
+import org.jboss.dna.graph.connector.RepositoryConnection;
+import org.jboss.dna.graph.connector.RepositorySource;
+import org.jboss.dna.graph.connector.RepositorySourceCapabilities;
+import org.jboss.dna.graph.connector.RepositorySourceException;
+import org.jboss.dna.graph.connector.path.AbstractPathRepositorySource;
+import org.jboss.dna.graph.connector.path.PathRepositoryConnection;
+
+/**
+ * The {@link RepositorySource} for the connector that exposes an area of the local/remote svn repository as content in a
+ * repository. This source considers a workspace name to be the path to the directory on the repository's root directory location
+ * that represents the root of that workspace. New workspaces can be created, as long as the names represent valid paths to
+ * existing directories.
+ */
+@ThreadSafe
+public class SvnRepositorySource extends AbstractPathRepositorySource implements ObjectFactory {
+
+ /**
+ * The first serialized version of this source. Version {@value}.
+ */
+ private static final long serialVersionUID = 1L;
+
+ protected static final String SOURCE_NAME = "sourceName";
+ protected static final String SVN_REPOSITORY_ROOT_URL = "repositoryRootURL";
+ protected static final String SVN_USERNAME = "username";
+ protected static final String SVN_PASSWORD = "password";
+ protected static final String CACHE_TIME_TO_LIVE_IN_MILLISECONDS = "cacheTimeToLiveInMilliseconds";
+ protected static final String RETRY_LIMIT = "retryLimit";
+ protected static final String ROOT_NODE_UUID = "rootNodeUuid";
+ protected static final String DEFAULT_WORKSPACE = "defaultWorkspace";
+ protected static final String PREDEFINED_WORKSPACE_NAMES = "predefinedWorkspaceNames";
+ protected static final String ALLOW_CREATING_WORKSPACES = "allowCreatingWorkspaces";
+
+ /**
+ * This source supports events.
+ */
+ protected static final boolean SUPPORTS_EVENTS = true;
+ /**
+ * This source supports same-name-siblings.
+ */
+ protected static final boolean SUPPORTS_SAME_NAME_SIBLINGS = false;
+ /**
+ * This source does support creating workspaces.
+ */
+ protected static final boolean DEFAULT_SUPPORTS_CREATING_WORKSPACES = true;
+ /**
+ * This source does not support updates by default, but each instance may be configured to be read-only or updateable.
+ */
+ public static final boolean DEFAULT_SUPPORTS_UPDATES = false;
+
+ /**
+ * This source supports creating references.
+ */
+ protected static final boolean SUPPORTS_REFERENCES = false;
+
+ private volatile String repositoryRootUrl;
+ private volatile String username;
+ private volatile String password;
+ private volatile String defaultWorkspace;
+ private volatile String[] predefinedWorkspaces = new String[] {};
+ private volatile RepositorySourceCapabilities capabilities = new RepositorySourceCapabilities(
+ SUPPORTS_SAME_NAME_SIBLINGS,
+ DEFAULT_SUPPORTS_UPDATES,
+ SUPPORTS_EVENTS,
+ DEFAULT_SUPPORTS_CREATING_WORKSPACES,
+ SUPPORTS_REFERENCES);
+
+ private transient SvnRepository repository;
+
+ /**
+ * Create a repository source instance.
+ */
+ public SvnRepositorySource() {
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.connector.RepositorySource#getCapabilities()
+ */
+ public RepositorySourceCapabilities getCapabilities() {
+ return capabilities;
+ }
+
+ /**
+ * @return the url
+ */
+ public String getRepositoryRootUrl() {
+ return this.repositoryRootUrl;
+ }
+
+ /**
+ * Set the url for the subversion repository.
+ *
+ * @param url - the url location.
+ * @throws IllegalArgumentException if the url is null or empty
+ */
+ public synchronized void setRepositoryRootUrl( String url ) {
+ CheckArg.isNotEmpty(url, "RepositoryRootUrl");
+ this.repositoryRootUrl = url;
+ }
+
+ public String getUsername() {
+ return this.username;
+ }
+
+ /**
+ * @param username - the username to use to connect to the SVN repository
+ */
+ public synchronized void setUsername( String username ) {
+ this.username = username;
+ }
+
+ public String getPassword() {
+ return this.password;
+ }
+
+ /**
+ * @param password - the password to use to connect to the SVN repository
+ */
+ public synchronized void setPassword( String password ) {
+ this.password = password;
+ }
+
+ /**
+ * Get whether this source supports updates.
+ *
+ * @return true if this source supports updates, or false if this source only supports reading content.
+ */
+ public boolean getSupportsUpdates() {
+ return capabilities.supportsUpdates();
+ }
+
+ /**
+ * Get the file system path to the existing directory that should be used for the default workspace. If the default is
+ * specified as a null String or is not a valid and resolvable path, this source will consider the default to be the current
+ * working directory of this virtual machine, as defined by <code>new File(".")</code>.
+ *
+ * @return the file system path to the directory representing the default workspace, or null if the default should be the
+ * current working directory
+ */
+ public String getDirectoryForDefaultWorkspace() {
+ return defaultWorkspace;
+ }
+
+ public String getDefaultWorkspaceName() {
+ return defaultWorkspace;
+ }
+
+ /**
+ * Set the file system path to the existing directory that should be used for the default workspace. If the default is
+ * specified as a null String or is not a valid and resolvable path, this source will consider the default to be the current
+ * working directory of this virtual machine, as defined by <code>new File(".")</code>.
+ *
+ * @param pathToDirectoryForDefaultWorkspace the valid and resolvable file system path to the directory representing the
+ * default workspace, or null if the current working directory should be used as the default workspace
+ */
+ public synchronized void setDirectoryForDefaultWorkspace( String pathToDirectoryForDefaultWorkspace ) {
+ this.defaultWorkspace = pathToDirectoryForDefaultWorkspace;
+ }
+
+ /**
+ * Gets the names of the workspaces that are available when this source is created. Each workspace name corresponds to a path
+ * to a directory on the file system.
+ *
+ * @return the names of the workspaces that this source starts with, or null if there are no such workspaces
+ * @see #setPredefinedWorkspaceNames(String[])
+ * @see #setCreatingWorkspacesAllowed(boolean)
+ */
+ public synchronized String[] getPredefinedWorkspaceNames() {
+ if (predefinedWorkspaces == null) return null;
+ String[] copy = new String[predefinedWorkspaces.length];
+ System.arraycopy(predefinedWorkspaces, 0, copy, 0, predefinedWorkspaces.length);
+ return copy;
+ }
+
+ /**
+ * Sets the names of the workspaces that are available when this source is created. Each workspace name corresponds to a path
+ * to a directory on the file system.
+ *
+ * @param predefinedWorkspaceNames the names of the workspaces that this source should start with, or null if there are no
+ * such workspaces
+ * @see #setCreatingWorkspacesAllowed(boolean)
+ * @see #getPredefinedWorkspaceNames()
+ */
+ public synchronized void setPredefinedWorkspaceNames( String[] predefinedWorkspaceNames ) {
+ this.predefinedWorkspaces = predefinedWorkspaceNames;
+ }
+
+ /**
+ * Get whether this source allows workspaces to be created dynamically.
+ *
+ * @return true if this source allows workspaces to be created by clients, or false if the set of workspaces is fixed
+ * @see #setPredefinedWorkspaceNames(String[])
+ * @see #getPredefinedWorkspaceNames()
+ * @see #setCreatingWorkspacesAllowed(boolean)
+ */
+ public boolean isCreatingWorkspacesAllowed() {
+ return capabilities.supportsCreatingWorkspaces();
+ }
+
+ /**
+ * Set whether this source allows workspaces to be created dynamically.
+ *
+ * @param allowWorkspaceCreation true if this source allows workspaces to be created by clients, or false if the set of
+ * workspaces is fixed
+ * @see #setPredefinedWorkspaceNames(String[])
+ * @see #getPredefinedWorkspaceNames()
+ * @see #isCreatingWorkspacesAllowed()
+ */
+ public synchronized void setCreatingWorkspacesAllowed( boolean allowWorkspaceCreation ) {
+ capabilities = new RepositorySourceCapabilities(capabilities.supportsSameNameSiblings(), capabilities.supportsUpdates(),
+ capabilities.supportsEvents(), allowWorkspaceCreation,
+ capabilities.supportsReferences());
+ }
+
+ /**
+ * Get whether this source allows updates.
+ *
+ * @return true if this source allows updates by clients, or false if no updates are allowed
+ * @see #setUpdatesAllowed(boolean)
+ */
+ @Override
+ public boolean areUpdatesAllowed() {
+ return capabilities.supportsUpdates();
+ }
+
+ /**
+ * Set whether this source allows updates to data within workspaces.
+ *
+ * @param allowUpdates true if this source allows clients to update data within workspaces, or false if updates are not
+ * allowed.
+ * @see #areUpdatesAllowed()
+ */
+ public synchronized void setUpdatesAllowed( boolean allowUpdates ) {
+ capabilities = new RepositorySourceCapabilities(capabilities.supportsSameNameSiblings(), allowUpdates,
+ capabilities.supportsEvents(), capabilities.supportsCreatingWorkspaces(),
+ capabilities.supportsReferences());
+ }
+
+ /**
+ * {@inheritDoc}
+ */
+ @Override
+ public boolean equals( Object obj ) {
+ if (obj == this) return true;
+ if (obj instanceof SvnRepositorySource) {
+ SvnRepositorySource that = (SvnRepositorySource)obj;
+ if (this.getName() == null) {
+ if (that.getName() != null) return false;
+ } else {
+ if (!this.getName().equals(that.getName())) return false;
+ }
+ return true;
+ }
+ return false;
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see javax.naming.Referenceable#getReference()
+ */
+ public synchronized Reference getReference() {
+ String className = getClass().getName();
+ Reference ref = new Reference(className, className, null);
+
+ if (getName() != null) {
+ ref.add(new StringRefAddr(SOURCE_NAME, getName()));
+ }
+ if (getRepositoryRootUrl() != null) {
+ ref.add(new StringRefAddr(SVN_REPOSITORY_ROOT_URL, getRepositoryRootUrl()));
+ }
+ if (getUsername() != null) {
+ ref.add(new StringRefAddr(SVN_USERNAME, getUsername()));
+ }
+ if (getPassword() != null) {
+ ref.add(new StringRefAddr(SVN_PASSWORD, getPassword()));
+ }
+ ref.add(new StringRefAddr(RETRY_LIMIT, Integer.toString(getRetryLimit())));
+ ref.add(new StringRefAddr(ROOT_NODE_UUID, rootNodeUuid.toString()));
+ ref.add(new StringRefAddr(DEFAULT_WORKSPACE, getDirectoryForDefaultWorkspace()));
+ ref.add(new StringRefAddr(ALLOW_CREATING_WORKSPACES, Boolean.toString(isCreatingWorkspacesAllowed())));
+ String[] workspaceNames = getPredefinedWorkspaceNames();
+ if (workspaceNames != null && workspaceNames.length != 0) {
+ ref.add(new StringRefAddr(PREDEFINED_WORKSPACE_NAMES, StringUtil.combineLines(workspaceNames)));
+ }
+ return ref;
+
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see javax.naming.spi.ObjectFactory#getObjectInstance(java.lang.Object, javax.naming.Name, javax.naming.Context,
+ * java.util.Hashtable)
+ */
+ public Object getObjectInstance( Object obj,
+ Name name,
+ Context nameCtx,
+ Hashtable<?, ?> environment ) throws Exception {
+ if (!(obj instanceof Reference)) return null;
+
+ Map<String, Object> values = valuesFrom((Reference)obj);
+
+ String sourceName = (String)values.get(SOURCE_NAME);
+ String repositoryRootUrl = (String)values.get(SVN_REPOSITORY_ROOT_URL);
+ String username = (String)values.get(SVN_USERNAME);
+ String password = (String)values.get(SVN_PASSWORD);
+ String retryLimit = (String)values.get(RETRY_LIMIT);
+ String rootNodeUuid = (String)values.get(ROOT_NODE_UUID);
+ String defaultWorkspace = (String)values.get(DEFAULT_WORKSPACE);
+ String createWorkspaces = (String)values.get(ALLOW_CREATING_WORKSPACES);
+
+ String combinedWorkspaceNames = (String)values.get(PREDEFINED_WORKSPACE_NAMES);
+ String[] workspaceNames = null;
+ if (combinedWorkspaceNames != null) {
+ List<String> paths = StringUtil.splitLines(combinedWorkspaceNames);
+ workspaceNames = paths.toArray(new String[paths.size()]);
+ }
+ // Create the source instance ...
+ SvnRepositorySource source = new SvnRepositorySource();
+ if (sourceName != null) source.setName(sourceName);
+ if (repositoryRootUrl != null && repositoryRootUrl.length() > 0) source.setRepositoryRootUrl(repositoryRootUrl);
+ if (username != null) source.setUsername(username);
+ if (password != null) source.setPassword(password);
+ if (retryLimit != null) source.setRetryLimit(Integer.parseInt(retryLimit));
+ if (rootNodeUuid != null) source.setRootNodeUuid(rootNodeUuid);
+ if (defaultWorkspace != null) source.setDirectoryForDefaultWorkspace(defaultWorkspace);
+ if (createWorkspaces != null) source.setCreatingWorkspacesAllowed(Boolean.parseBoolean(createWorkspaces));
+ if (workspaceNames != null && workspaceNames.length != 0) source.setPredefinedWorkspaceNames(workspaceNames);
+ return source;
+ }
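The `getObjectInstance` method above rebuilds the predefined workspace names from a single combined string. Assuming `StringUtil.combineLines`/`splitLines` are simple newline join/split operations (an assumption about that utility, not verified here), the round trip can be sketched with plain JDK calls; the class and method names below are hypothetical stand-ins:

```java
import java.util.Arrays;
import java.util.List;

public class WorkspaceNamesRoundTrip {

    // Stand-in for StringUtil.combineLines: join the names with newlines.
    static String combineLines(String[] lines) {
        return String.join("\n", lines);
    }

    // Stand-in for StringUtil.splitLines: split the combined string back apart.
    static String[] splitLines(String combined) {
        List<String> parts = Arrays.asList(combined.split("\n"));
        return parts.toArray(new String[parts.size()]);
    }

    public static void main(String[] args) {
        String[] names = {"trunk", "branches", "tags"};
        String[] restored = splitLines(combineLines(names));
        System.out.println(Arrays.equals(names, restored)); // prints "true"
    }
}
```

This flattening is needed because a JNDI `StringRefAddr` can carry only a single string per key.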
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.connector.RepositorySource#getConnection()
+ */
+ public synchronized RepositoryConnection getConnection() throws RepositorySourceException {
+
+ String sourceName = getName();
+ if (sourceName == null || sourceName.trim().length() == 0) {
+ I18n msg = SvnRepositoryConnectorI18n.propertyIsRequired;
+ throw new RepositorySourceException(getName(), msg.text("name"));
+ }
+
+ String sourceUsername = getUsername();
+ if (sourceUsername == null || sourceUsername.trim().length() == 0) {
+ I18n msg = SvnRepositoryConnectorI18n.propertyIsRequired;
+ throw new RepositorySourceException(getName(), msg.text("username"));
+ }
+
+ String sourcePassword = getPassword();
+ if (sourcePassword == null) {
+ I18n msg = SvnRepositoryConnectorI18n.propertyIsRequired;
+ throw new RepositorySourceException(getName(), msg.text("password"));
+ }
+
+ String repositoryRootURL = getRepositoryRootUrl();
+ if (repositoryRootURL == null || repositoryRootURL.trim().length() == 0) {
+ I18n msg = SvnRepositoryConnectorI18n.propertyIsRequired;
+ throw new RepositorySourceException(getName(), msg.text("repositoryRootURL"));
+ }
+
+ if (this.repository == null) {
+ this.repository = new SvnRepository(this);
+ }
+
+ return new PathRepositoryConnection(this, this.repository);
+ }
+}
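The `getReference()`/`getObjectInstance()` pair in the source above round-trips the connector's configuration through a JNDI `Reference` built from `StringRefAddr` entries. A minimal, self-contained sketch of that pattern, using only the JDK's `javax.naming` classes and hypothetical key names (not the connector's real constants), might look like:

```java
import javax.naming.RefAddr;
import javax.naming.Reference;
import javax.naming.StringRefAddr;
import java.util.Enumeration;
import java.util.HashMap;
import java.util.Map;

public class ReferenceRoundTrip {

    // Mirror of getReference(): encode each configuration property as a StringRefAddr.
    static Reference encode(Map<String, String> props) {
        Reference ref = new Reference("com.example.FakeSource", "com.example.FakeSource", null);
        for (Map.Entry<String, String> e : props.entrySet()) {
            ref.add(new StringRefAddr(e.getKey(), e.getValue()));
        }
        return ref;
    }

    // Mirror of the valuesFrom(Reference) step in getObjectInstance(): read the entries back.
    static Map<String, Object> decode(Reference ref) {
        Map<String, Object> values = new HashMap<>();
        for (Enumeration<RefAddr> en = ref.getAll(); en.hasMoreElements();) {
            RefAddr addr = en.nextElement();
            values.put(addr.getType(), addr.getContent());
        }
        return values;
    }

    public static void main(String[] args) {
        Map<String, String> props = new HashMap<>();
        props.put("sourceName", "svn");
        props.put("repositoryRootUrl", "http://example.com/svn");
        Map<String, Object> values = decode(encode(props));
        System.out.println(values.get("sourceName")); // prints "svn"
    }
}
```

Because only strings survive the round trip, the real `getObjectInstance` must parse non-string settings (retry limit, booleans, the UUID) back from their string forms.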
Copied: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepositoryUtil.java (from rev 1524, trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryUtil.java)
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepositoryUtil.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SvnRepositoryUtil.java 2010-01-05 12:50:08 UTC (rev 1525)
@@ -0,0 +1,235 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.connector.svn;
+
+import java.util.Collection;
+import java.util.Collections;
+import org.jboss.dna.graph.connector.RepositorySourceException;
+import org.jboss.dna.graph.request.InvalidWorkspaceException;
+import org.tmatesoft.svn.core.SVNDirEntry;
+import org.tmatesoft.svn.core.SVNErrorCode;
+import org.tmatesoft.svn.core.SVNErrorMessage;
+import org.tmatesoft.svn.core.SVNException;
+import org.tmatesoft.svn.core.SVNNodeKind;
+import org.tmatesoft.svn.core.SVNURL;
+import org.tmatesoft.svn.core.auth.ISVNAuthenticationManager;
+import org.tmatesoft.svn.core.internal.io.dav.DAVRepositoryFactory;
+import org.tmatesoft.svn.core.internal.io.fs.FSRepositoryFactory;
+import org.tmatesoft.svn.core.internal.io.svn.SVNRepositoryFactoryImpl;
+import org.tmatesoft.svn.core.io.SVNRepository;
+import org.tmatesoft.svn.core.io.SVNRepositoryFactory;
+import org.tmatesoft.svn.core.wc.SVNWCUtil;
+
+/**
+ * Utility methods for working with {@link SVNRepository} instances and repository paths.
+ */
+public class SvnRepositoryUtil {
+
+ /**
+ * @param url
+ * @param sourceName
+ * @return SVNURL
+ */
+ public static SVNURL createSVNURL( String url,
+ String sourceName ) {
+
+ SVNURL theUrl;
+ try {
+ theUrl = SVNURL.parseURIDecoded(url);
+ } catch (SVNException e) {
+ // The protocol is not supported, or the URL is malformed
+ throw new RepositorySourceException(sourceName,
+ "The protocol is not supported by this connector, or the SVN URL is malformed");
+ }
+ return theUrl;
+ }
+
+ public static void setNewSVNRepositoryLocation( SVNRepository oldRepository,
+ String url,
+ boolean forceReconnect,
+ String sourceName ) {
+ try {
+ oldRepository.setLocation(createSVNURL(url, sourceName), forceReconnect);
+ } catch (SVNException e) {
+ throw new RepositorySourceException(sourceName, "The old URL and the new URL have different protocols");
+ }
+ }
+
+ /**
+ * @param repository
+ * @param path
+ * @param revisionNumber
+ * @param sourceName
+ * @return the SVNNodeKind for the given path, or null if the path could not be checked
+ */
+ public static SVNNodeKind checkThePath( SVNRepository repository,
+ String path,
+ long revisionNumber,
+ String sourceName ) {
+ SVNNodeKind kind;
+ try {
+ kind = repository.checkPath(path, revisionNumber);
+
+ } catch (SVNException e) {
+ return null;
+ }
+ return kind;
+ }
+
+ /**
+ * Create a {@link SVNRepository} for the given URL, supporting DAV (http/https), file, and svn/svn+ssh protocols.
+ *
+ * @param url the url of the repository.
+ * @param username the username credential.
+ * @param password the password credential.
+ * @return the configured {@link SVNRepository}.
+ */
+ public static SVNRepository createRepository( String url,
+ String username,
+ String password ) {
+ // for DAV (over http and https)
+ DAVRepositoryFactory.setup();
+ // For File
+ FSRepositoryFactory.setup();
+ // for SVN (over svn and svn+ssh)
+ SVNRepositoryFactoryImpl.setup();
+
+ // The factory knows how to create a DAVRepository
+ SVNRepository repository;
+ try {
+ repository = SVNRepositoryFactory.create(SVNURL.parseURIDecoded(url));
+
+ ISVNAuthenticationManager authManager = SVNWCUtil.createDefaultAuthenticationManager(username, password);
+ repository.setAuthenticationManager(authManager);
+ } catch (SVNException e) {
+ throw new InvalidWorkspaceException(SvnRepositoryConnectorI18n.workspaceDoesNotExist.text(e.getMessage()));
+ }
+ return repository;
+ }
+
+ /**
+ * Utility to get the last segment of the repository location path, used as the workspace name.
+ *
+ * @param repository the SVN repository
+ * @return the last path segment.
+ */
+ public static String getRepositoryWorspaceName( SVNRepository repository ) {
+ String[] segments = repository.getLocation().getPath().split("/");
+ return segments[segments.length - 1];
+ }
+
+ private SvnRepositoryUtil() {
+ // prevent construction
+ }
+
+ /**
+ * Check if the repository path exists.
+ *
+ * @param repos
+ * @return true if the repository exists, false otherwise.
+ */
+ public static boolean exist( SVNRepository repos ) {
+ try {
+ return repos.checkPath("", -1) != SVNNodeKind.NONE;
+ } catch (SVNException e) {
+ return false;
+ }
+ }
+
+ /**
+ * Check if repository path is a directory.
+ *
+ * @param repos
+ * @param path
+ * @return true if repository path is a directory and false otherwise.
+ */
+ public static boolean isDirectory( SVNRepository repos,
+ String path ) {
+ try {
+ SVNNodeKind kind = repos.checkPath(path, -1);
+ if (kind == SVNNodeKind.DIR) {
+ return true;
+ }
+ } catch (SVNException e) {
+ return false;
+ }
+ return false;
+ }
+
+ /**
+ * @param repos
+ * @param path
+ * @return a collection of the entries in the directory at the given path; never null
+ */
+ @SuppressWarnings( "unchecked" )
+ public static Collection<SVNDirEntry> getDir( SVNRepository repos,
+ String path ) {
+ try {
+ return repos.getDir(path, -1, null, (Collection<SVNDirEntry>)null);
+ } catch (SVNException e) {
+ return Collections.emptyList();
+ }
+ }
+
+ /**
+ * Check if the path is a file.
+ *
+ * @param repos
+ * @param path
+ * @return true if the path is a file and false otherwise.
+ */
+ public static boolean isFile( SVNRepository repos,
+ String path ) {
+ try {
+ SVNNodeKind kind = repos.checkPath(path, -1);
+ if (kind == SVNNodeKind.FILE) {
+ return true;
+ }
+ } catch (SVNException e) {
+ return false;
+ }
+ return false;
+ }
+
+ public static boolean exists( SVNRepository repository,
+ String path ) throws SVNException {
+ try {
+ SVNNodeKind kind = repository.checkPath(path, -1);
+ return kind != SVNNodeKind.NONE && kind != SVNNodeKind.UNKNOWN;
+ } catch (SVNException e) {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
+ "unknown error while checking the path: {0}",
+ e.getMessage());
+ throw new SVNException(err);
+ }
+ }
+}
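The `getRepositoryWorspaceName` method above derives a workspace name by taking the last segment of the repository location's path. The string handling can be sketched in isolation, without SVNKit; the class and method names below are hypothetical:

```java
public class PathSegments {

    // Same approach as getRepositoryWorspaceName: split on '/' and keep the last segment.
    static String lastSegment(String path) {
        String[] segments = path.split("/");
        return segments[segments.length - 1];
    }

    public static void main(String[] args) {
        System.out.println(lastSegment("/repos/project/trunk")); // prints "trunk"
    }
}
```

Note that `String.split` discards trailing empty strings, so a URL path ending in "/" still yields the last non-empty segment.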
Copied: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/package-info.java (from rev 1524, trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/package-info.java)
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/package-info.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/package-info.java 2010-01-05 12:50:08 UTC (rev 1525)
@@ -0,0 +1,29 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+/**
+ * The classes that make up the connector that accesses content from an SVN repository.
+ */
+
+package org.jboss.dna.connector.svn;
+
Copied: trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn (from rev 1524, trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn2)
Copied: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn (from rev 1524, trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2)
Modified: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnConnectorTestUtil.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnConnectorTestUtil.java 2010-01-05 12:43:25 UTC (rev 1524)
+++ trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnConnectorTestUtil.java 2010-01-05 12:50:08 UTC (rev 1525)
@@ -21,7 +21,7 @@
* Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
* 02110-1301 USA, or see the FSF site: http://www.fsf.org.
*/
-package org.jboss.dna.connector.svn2;
+package org.jboss.dna.connector.svn;
import java.io.File;
import java.io.IOException;
Modified: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnIntegrationTest.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnIntegrationTest.java 2010-01-05 12:43:25 UTC (rev 1524)
+++ trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnIntegrationTest.java 2010-01-05 12:50:08 UTC (rev 1525)
@@ -21,12 +21,13 @@
* Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
* 02110-1301 USA, or see the FSF site: http://www.fsf.org.
*/
-package org.jboss.dna.connector.svn2;
+package org.jboss.dna.connector.svn;
import static org.hamcrest.core.Is.is;
import static org.hamcrest.core.IsNull.notNullValue;
import static org.junit.Assert.assertThat;
import java.util.Map;
+import org.jboss.dna.connector.svn.SvnRepositorySource;
import org.jboss.dna.graph.ExecutionContext;
import org.jboss.dna.graph.Graph;
import org.jboss.dna.graph.Location;
Modified: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRepositoryConnectorI18nTest.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorI18nTest.java 2010-01-05 12:43:25 UTC (rev 1524)
+++ trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRepositoryConnectorI18nTest.java 2010-01-05 12:50:08 UTC (rev 1525)
@@ -21,9 +21,10 @@
* Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
* 02110-1301 USA, or see the FSF site: http://www.fsf.org.
*/
-package org.jboss.dna.connector.svn2;
+package org.jboss.dna.connector.svn;
import org.jboss.dna.common.AbstractI18nTest;
+import org.jboss.dna.connector.svn.SvnRepositoryConnectorI18n;
/**
*/
Modified: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRepositoryConnectorNoCreateWorkspaceTest.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorNoCreateWorkspaceTest.java 2010-01-05 12:43:25 UTC (rev 1524)
+++ trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRepositoryConnectorNoCreateWorkspaceTest.java 2010-01-05 12:50:08 UTC (rev 1525)
@@ -1,5 +1,6 @@
-package org.jboss.dna.connector.svn2;
+package org.jboss.dna.connector.svn;
+import org.jboss.dna.connector.svn.SvnRepositorySource;
import org.jboss.dna.graph.Graph;
import org.jboss.dna.graph.connector.RepositorySource;
import org.jboss.dna.graph.connector.test.WorkspaceConnectorTest;
Modified: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRepositoryConnectorNotWritableTest.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorNotWritableTest.java 2010-01-05 12:43:25 UTC (rev 1524)
+++ trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRepositoryConnectorNotWritableTest.java 2010-01-05 12:50:08 UTC (rev 1525)
@@ -1,5 +1,6 @@
-package org.jboss.dna.connector.svn2;
+package org.jboss.dna.connector.svn;
+import org.jboss.dna.connector.svn.SvnRepositorySource;
import org.jboss.dna.graph.Graph;
import org.jboss.dna.graph.connector.RepositorySource;
import org.jboss.dna.graph.connector.test.NotWritableConnectorTest;
Modified: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRepositoryConnectorWritableTest.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorWritableTest.java 2010-01-05 12:43:25 UTC (rev 1524)
+++ trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRepositoryConnectorWritableTest.java 2010-01-05 12:50:08 UTC (rev 1525)
@@ -21,12 +21,13 @@
* Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
* 02110-1301 USA, or see the FSF site: http://www.fsf.org.
*/
-package org.jboss.dna.connector.svn2;
+package org.jboss.dna.connector.svn;
import static org.hamcrest.core.Is.is;
import static org.hamcrest.core.IsNull.notNullValue;
import static org.junit.Assert.assertThat;
import java.io.ByteArrayOutputStream;
+import org.jboss.dna.connector.svn.SvnRepositorySource;
import org.jboss.dna.graph.DnaLexicon;
import org.jboss.dna.graph.Graph;
import org.jboss.dna.graph.JcrLexicon;
Modified: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRepositorySourceTest.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositorySourceTest.java 2010-01-05 12:43:25 UTC (rev 1524)
+++ trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRepositorySourceTest.java 2010-01-05 12:50:08 UTC (rev 1525)
@@ -21,7 +21,7 @@
* Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
* 02110-1301 USA, or see the FSF site: http://www.fsf.org.
*/
-package org.jboss.dna.connector.svn2;
+package org.jboss.dna.connector.svn;
import static org.hamcrest.core.Is.is;
import static org.hamcrest.core.IsNull.notNullValue;
@@ -42,6 +42,7 @@
import javax.naming.RefAddr;
import javax.naming.Reference;
import javax.naming.spi.ObjectFactory;
+import org.jboss.dna.connector.svn.SvnRepositorySource;
import org.jboss.dna.graph.ExecutionContext;
import org.jboss.dna.graph.Subgraph;
import org.jboss.dna.graph.cache.BasicCachePolicy;
Modified: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRespositoryConnectorReadableTest.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRespositoryConnectorReadableTest.java 2010-01-05 12:43:25 UTC (rev 1524)
+++ trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/SvnRespositoryConnectorReadableTest.java 2010-01-05 12:50:08 UTC (rev 1525)
@@ -21,12 +21,13 @@
* Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
* 02110-1301 USA, or see the FSF site: http://www.fsf.org.
*/
-package org.jboss.dna.connector.svn2;
+package org.jboss.dna.connector.svn;
import static org.hamcrest.core.Is.is;
import static org.hamcrest.core.IsNull.notNullValue;
import static org.junit.Assert.assertThat;
import java.util.List;
+import org.jboss.dna.connector.svn.SvnRepositorySource;
import org.jboss.dna.graph.Graph;
import org.jboss.dna.graph.JcrLexicon;
import org.jboss.dna.graph.JcrNtLexicon;
DNA SVN: r1524 - in trunk: extensions/dna-connector-svn and 7 other directories.
by dna-commits@lists.jboss.org
Author: bcarothers
Date: 2010-01-05 07:43:25 -0500 (Tue, 05 Jan 2010)
New Revision: 1524
Added:
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnActionExecutor.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepository.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorI18n.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryLexicon.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositorySource.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryUtil.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/package-info.java
trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn2/
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnConnectorTestUtil.java
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnIntegrationTest.java
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorI18nTest.java
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorNoCreateWorkspaceTest.java
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorNotWritableTest.java
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorWritableTest.java
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositorySourceTest.java
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRespositoryConnectorReadableTest.java
Removed:
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/RepositoryAccessData.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNActionExecutor.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNProtocol.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryConnection.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryConnectorI18n.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryLexicon.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryRequestProcessor.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositorySource.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryUtil.java
trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/package-info.java
trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn/
trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn/
Modified:
trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/svn/SvnAndJcrIntegrationTest.java
trunk/extensions/dna-connector-svn/pom.xml
trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn2/SVNRepositoryConnectorI18n.properties
Log:
DNA-519
Applied patch that has passed Serge's review. This will be committed in two parts to work around case sensitivity issues on Windows. In the first part, this patch will be applied and all files will move into an svn2 package with the class name case corrected from SVN* to Svn*. A subsequent patch will move the classes back into the svn package.
Modified: trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/svn/SvnAndJcrIntegrationTest.java
===================================================================
--- trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/svn/SvnAndJcrIntegrationTest.java 2010-01-04 20:58:21 UTC (rev 1523)
+++ trunk/dna-integration-tests/src/test/java/org/jboss/dna/test/integration/svn/SvnAndJcrIntegrationTest.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -31,7 +31,7 @@
import javax.jcr.Property;
import javax.jcr.PropertyIterator;
import javax.jcr.Session;
-import org.jboss.dna.connector.svn.SVNRepositorySource;
+import org.jboss.dna.connector.svn2.SvnRepositorySource;
import org.jboss.dna.graph.SecurityContext;
import org.jboss.dna.jcr.JcrConfiguration;
import org.jboss.dna.jcr.JcrEngine;
@@ -56,15 +56,16 @@
final String repositoryName = "svnRepository";
final JcrConfiguration configuration = new JcrConfiguration();
configuration.repositorySource(svnRepositorySource)
- .usingClass(SVNRepositorySource.class)
+ .usingClass(SvnRepositorySource.class)
.setProperty("password", "")
.setProperty("username", "anonymous")
- .setProperty("repositoryRootURL", repositoryUrl)
+ .setProperty("repositoryRootUrl", repositoryUrl)
.setProperty("predefinedWorkspaceNames", predefinedWorkspaceNames)
.setProperty("directoryForDefaultWorkspace", predefinedWorkspaceNames[0])
.setProperty("creatingWorkspacesAllowed", false);
- configuration.repository(repositoryName).setSource(svnRepositorySource).setOption(Option.READ_DEPTH, "1");
+ configuration.repository(repositoryName).setSource(svnRepositorySource).setOption(Option.QUERY_EXECUTION_ENABLED, "false");
+
configuration.save();
this.engine = configuration.build();
this.engine.start();
@@ -107,6 +108,19 @@
}
}
+ @Test
+ public void shouldProvideAccessToJcrDataNodeUnderDeepFileNode() throws Exception {
+ String path = "extensions/dna-sequencer-text/src/test/resources/delimited/multiLineCommaDelimitedFile.csv/jcr:content";
+ System.out.println("Getting " + path + " and then walking its properties ...");
+ Node resourceNodeOfPomFile = this.session.getRootNode().getNode(path);
+ assertThat(resourceNodeOfPomFile, is(notNullValue()));
+
+ for (PropertyIterator iter = resourceNodeOfPomFile.getProperties(); iter.hasNext();) {
+ Property property = iter.nextProperty();
+ assertThat(property.getName(), is(notNullValue()));
+ }
+ }
+
protected class MyCustomSecurityContext implements SecurityContext {
/**
* {@inheritDoc}
Modified: trunk/extensions/dna-connector-svn/pom.xml
===================================================================
--- trunk/extensions/dna-connector-svn/pom.xml 2010-01-04 20:58:21 UTC (rev 1523)
+++ trunk/extensions/dna-connector-svn/pom.xml 2010-01-05 12:43:25 UTC (rev 1524)
@@ -51,6 +51,11 @@
<artifactId>svnkit</artifactId>
<version>1.3.0.5847</version>
</dependency>
+ <dependency>
+ <groupId>com.sun.jna</groupId>
+ <artifactId>jna</artifactId>
+ <version>3.0.9</version>
+ </dependency>
<!--
Testing (note the scope)
-->
Deleted: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/RepositoryAccessData.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/RepositoryAccessData.java 2010-01-04 20:58:21 UTC (rev 1523)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/RepositoryAccessData.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -1,49 +0,0 @@
-/**
- *
- */
-package org.jboss.dna.connector.svn;
-
-import net.jcip.annotations.ThreadSafe;
-
-@ThreadSafe
-public class RepositoryAccessData {
-
-
- private String repositoryRootUrl;
- private String username;
- private String password;
-
- /**
- * @param password
- * @param username
- * @param repositoryRootUrl
- */
- public RepositoryAccessData( String repositoryRootUrl,
- String username,
- String password ) {
- this.repositoryRootUrl = repositoryRootUrl;
- this.username = username;
- this.password = password;
- }
-
- /**
- * @return the repositoryRootUrl
- */
- public String getRepositoryRootUrl() {
- return repositoryRootUrl;
- }
-
- /**
- * @return the username
- */
- public String getUsername() {
- return username;
- }
-
- /**
- * @return the password
- */
- public String getPassword() {
- return password;
- }
-}
Deleted: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNActionExecutor.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNActionExecutor.java 2010-01-04 20:58:21 UTC (rev 1523)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNActionExecutor.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -1,73 +0,0 @@
-/*
- * JBoss DNA (http://www.jboss.org/dna)
- * See the COPYRIGHT.txt file distributed with this work for information
- * regarding copyright ownership. Some portions may be licensed
- * to Red Hat, Inc. under one or more contributor license agreements.
- * See the AUTHORS.txt file in the distribution for a full listing of
- * individual contributors.
- *
- * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
- * is licensed to you under the terms of the GNU Lesser General Public License as
- * published by the Free Software Foundation; either version 2.1 of
- * the License, or (at your option) any later version.
- *
- * JBoss DNA is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with this software; if not, write to the Free
- * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
- * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
- */
-package org.jboss.dna.connector.svn;
-
-import org.jboss.dna.connector.scm.ScmAction;
-import org.jboss.dna.connector.scm.ScmActionExecutor;
-import org.tmatesoft.svn.core.SVNErrorCode;
-import org.tmatesoft.svn.core.SVNErrorMessage;
-import org.tmatesoft.svn.core.SVNException;
-import org.tmatesoft.svn.core.io.ISVNEditor;
-import org.tmatesoft.svn.core.io.SVNRepository;
-
-/**
- */
-public class SVNActionExecutor implements ScmActionExecutor {
-
- private SVNRepository repository;
-
- /**
- * @param repository
- */
- public SVNActionExecutor( SVNRepository repository ) {
- this.repository = repository;
- }
-
- /**
- * @return repository
- */
- public SVNRepository getRepository() {
- return repository;
- }
-
- /**
- * @param action
- * @param message
- * @throws SVNException
- */
- public void execute( ScmAction action,
- String message ) throws SVNException {
- ISVNEditor editor = this.repository.getCommitEditor(message, null);
- editor.openRoot(-1);
- try {
- action.applyAction(editor);
- } catch (Exception e) {
- SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN, "This error occurred: '{0}'", e.getMessage());
- throw new SVNException(err);
- }
- editor.closeDir();
- editor.closeEdit();
-
- }
-}
Deleted: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNProtocol.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNProtocol.java 2010-01-04 20:58:21 UTC (rev 1523)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNProtocol.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -1,44 +0,0 @@
-/*
- * JBoss DNA (http://www.jboss.org/dna)
- * See the COPYRIGHT.txt file distributed with this work for information
- * regarding copyright ownership. Some portions may be licensed
- * to Red Hat, Inc. under one or more contributor license agreements.
- * See the AUTHORS.txt file in the distribution for a full listing of
- * individual contributors.
- *
- * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
- * is licensed to you under the terms of the GNU Lesser General Public License as
- * published by the Free Software Foundation; either version 2.1 of
- * the License, or (at your option) any later version.
- *
- * JBoss DNA is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with this software; if not, write to the Free
- * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
- * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
- */
-package org.jboss.dna.connector.svn;
-
-/**
- */
-public enum SVNProtocol {
- FILE("file"),
- SVN("svn"),
- SVN_SSH("svn+ssh"),
- HTTP("http"),
- HTTPS("https");
-
- SVNProtocol( String value ) {
- this.value = value;
- }
-
- private final String value;
-
- public String value() {
- return value;
- }
-}
Deleted: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryConnection.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryConnection.java 2010-01-04 20:58:21 UTC (rev 1523)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryConnection.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -1,186 +0,0 @@
-/*
- * JBoss DNA (http://www.jboss.org/dna)
- * See the COPYRIGHT.txt file distributed with this work for information
- * regarding copyright ownership. Some portions may be licensed
- * to Red Hat, Inc. under one or more contributor license agreements.
- * See the AUTHORS.txt file in the distribution for a full listing of
- * individual contributors.
- *
- * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
- * is licensed to you under the terms of the GNU Lesser General Public License as
- * published by the Free Software Foundation; either version 2.1 of
- * the License, or (at your option) any later version.
- *
- * JBoss DNA is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with this software; if not, write to the Free
- * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
- * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
- */
-package org.jboss.dna.connector.svn;
-
-import java.util.Set;
-import java.util.concurrent.TimeUnit;
-import javax.transaction.xa.XAResource;
-import org.jboss.dna.common.util.CheckArg;
-import org.jboss.dna.graph.ExecutionContext;
-import org.jboss.dna.graph.cache.CachePolicy;
-import org.jboss.dna.graph.connector.RepositoryConnection;
-import org.jboss.dna.graph.connector.RepositorySourceException;
-import org.jboss.dna.graph.request.Request;
-import org.jboss.dna.graph.request.processor.RequestProcessor;
-import org.tmatesoft.svn.core.SVNErrorCode;
-import org.tmatesoft.svn.core.SVNErrorMessage;
-import org.tmatesoft.svn.core.SVNException;
-import org.tmatesoft.svn.core.SVNNodeKind;
-import org.tmatesoft.svn.core.io.SVNRepository;
-
-/**
- * The default repository connection to an SVN repository instance.
- */
-public class SVNRepositoryConnection implements RepositoryConnection {
-
- private final String sourceName;
- private final CachePolicy cachePolicy;
- private final SVNRepository defaultWorkspace;
- private final boolean updatesAllowed;
- private final Set<String> availableWorkspaceNames;
- private final boolean creatingWorkspacesAllowed;
- private final RepositoryAccessData accessData;
-
- /**
- * The default workspace can be the root of the repository or any folder beneath the root directory. The available workspace
- * names must consist of URLs of repository folders.
- *
- * @param sourceName
- * @param defaultWorkspace
- * @param availableWorkspaceNames
- * @param creatingWorkspacesAllowed
- * @param cachePolicy
- * @param updatesAllowed
- * @param accessData
- */
- public SVNRepositoryConnection( String sourceName,
- SVNRepository defaultWorkspace,
- Set<String> availableWorkspaceNames,
- boolean creatingWorkspacesAllowed,
- CachePolicy cachePolicy,
- boolean updatesAllowed,
- RepositoryAccessData accessData ) {
-
- CheckArg.isNotNull(defaultWorkspace, "defaultWorkspace");
- CheckArg.isNotEmpty(sourceName, "sourceName");
- assert availableWorkspaceNames != null;
- assert accessData != null;
-
- // Check if the default workspace is a folder.
- SVNNodeKind nodeKind = null;
- try {
- nodeKind = defaultWorkspace.checkPath("", -1);
- if (nodeKind == SVNNodeKind.NONE) {
- SVNErrorMessage error = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
- "No entry at URL ''{0}''",
- defaultWorkspace.getLocation().getPath());
- throw new SVNException(error);
- } else if (nodeKind == SVNNodeKind.UNKNOWN) {
- SVNErrorMessage error = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
- "Entry at URL ''{0}'' is of unknown kind",
- defaultWorkspace.getLocation().getPath());
- throw new SVNException(error);
- } else if (nodeKind == SVNNodeKind.FILE) {
- SVNErrorMessage error = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
- "Entry at URL ''{0}'' is a file while directory was expected",
- defaultWorkspace.getLocation().getPath());
- throw new SVNException(error);
- }
- } catch (SVNException e) {
- // Rethrow as an unchecked exception ...
- throw new RuntimeException(e);
- }
-
- this.sourceName = sourceName;
- this.cachePolicy = cachePolicy;
- this.defaultWorkspace = defaultWorkspace;
- this.updatesAllowed = updatesAllowed;
- this.availableWorkspaceNames = availableWorkspaceNames;
- this.creatingWorkspacesAllowed = creatingWorkspacesAllowed;
- this.accessData = accessData;
- }
-
- SVNRepository getDefaultWorkspace() {
- return defaultWorkspace;
- }
-
- /**
- * {@inheritDoc}
- */
- public String getSourceName() {
- return sourceName;
- }
-
- /**
- * {@inheritDoc}
- */
- public CachePolicy getDefaultCachePolicy() {
- return cachePolicy;
- }
-
- /**
- * {@inheritDoc}
- */
- public XAResource getXAResource() {
- return null;
- }
-
- /**
- * {@inheritDoc}
- */
- public boolean ping( long time,
- TimeUnit unit ) {
- try {
- this.defaultWorkspace.getRepositoryRoot(true);
- } catch (SVNException e) {
- return false;
- }
- return true;
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.connector.RepositoryConnection#close()
- */
- public void close() {
- // Nothing to do ...
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.connector.RepositoryConnection#execute(org.jboss.dna.graph.ExecutionContext,
- * org.jboss.dna.graph.request.Request)
- */
- public void execute( final ExecutionContext context,
- final Request request ) throws RepositorySourceException {
-
- RequestProcessor processor = new SVNRepositoryRequestProcessor(sourceName, defaultWorkspace, availableWorkspaceNames,
- creatingWorkspacesAllowed, context, updatesAllowed,
- accessData);
- try {
- processor.process(request);
- } finally {
- processor.close();
- }
- }
-
- /**
- * @return the accessData
- */
- public RepositoryAccessData getAccessData() {
- return accessData;
- }
-}
Deleted: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryConnectorI18n.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryConnectorI18n.java 2010-01-04 20:58:21 UTC (rev 1523)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryConnectorI18n.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -1,89 +0,0 @@
-/*
- * JBoss DNA (http://www.jboss.org/dna)
- * See the COPYRIGHT.txt file distributed with this work for information
- * regarding copyright ownership. Some portions may be licensed
- * to Red Hat, Inc. under one or more contributor license agreements.
- * See the AUTHORS.txt file in the distribution for a full listing of
- * individual contributors.
- *
- * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
- * is licensed to you under the terms of the GNU Lesser General Public License as
- * published by the Free Software Foundation; either version 2.1 of
- * the License, or (at your option) any later version.
- *
- * JBoss DNA is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with this software; if not, write to the Free
- * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
- * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
- */
-package org.jboss.dna.connector.svn;
-
-import java.util.Locale;
-import java.util.Set;
-import org.jboss.dna.common.i18n.I18n;
-
-/**
- * The internationalized string constants for the <code>org.jboss.dna.connector.svn*</code> packages.
- */
-public final class SVNRepositoryConnectorI18n {
-
- public static I18n connectorName;
- public static I18n nodeDoesNotExist;
- public static I18n nodeIsActuallyUnknow;
- public static I18n propertyIsRequired;
- public static I18n errorSerializingCachePolicyInSource;
- public static I18n locationInRequestMustHavePath;
- public static I18n sourceIsReadOnly;
- public static I18n sourceDoesNotSupportCreatingWorkspaces;
- public static I18n sourceDoesNotSupportCloningWorkspaces;
- public static I18n sourceDoesNotSupportDeletingWorkspaces;
- public static I18n connectingFailureOrUserAuthenticationProblem;
- public static I18n pathForPredefinedWorkspaceDoesNotExist;
- public static I18n pathForPredefinedWorkspaceIsNotDirectory;
- public static I18n pathForPredefinedWorkspaceCannotBeRead;
- public static I18n workspaceDoesNotExist;
- public static I18n pathForDefaultWorkspaceDoesNotExist;
- public static I18n pathForDefaultWorkspaceIsNotDirectory;
- public static I18n pathForDefaultWorkspaceCannotBeRead;
- public static I18n sameNameSiblingsAreNotAllowed;
- public static I18n onlyTheDefaultNamespaceIsAllowed;
- public static I18n unableToCreateWorkspaces;
- public static I18n pathForRequestIsNotCorrect;
- public static I18n pathForRequestMustStartWithAForwardSlash;
- public static I18n nodeAlreadyExist;
- public static I18n unsupportedPrimaryType;
- public static I18n invalidPropertyNames;
- public static I18n invalidNameForResource;
- public static I18n invalidPathForResource;
- public static I18n missingRequiredProperty;
- public static I18n couldNotCreateFile;
- public static I18n couldNotReadData;
- public static I18n deleteFailed;
-
- static {
- try {
- I18n.initialize(SVNRepositoryConnectorI18n.class);
- } catch (final Exception err) {
- System.err.println(err);
- }
- }
-
- public static Set<Locale> getLocalizationProblemLocales() {
- return I18n.getLocalizationProblemLocales(SVNRepositoryConnectorI18n.class);
- }
-
- public static Set<String> getLocalizationProblems() {
- return I18n.getLocalizationProblems(SVNRepositoryConnectorI18n.class);
- }
-
- public static Set<String> getLocalizationProblems( Locale locale ) {
- return I18n.getLocalizationProblems(SVNRepositoryConnectorI18n.class, locale);
- }
-
-
-}
Deleted: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryLexicon.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryLexicon.java 2010-01-04 20:58:21 UTC (rev 1523)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryLexicon.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -1,42 +0,0 @@
-/*
- * JBoss DNA (http://www.jboss.org/dna)
- * See the COPYRIGHT.txt file distributed with this work for information
- * regarding copyright ownership. Some portions may be licensed
- * to Red Hat, Inc. under one or more contributor license agreements.
- * See the AUTHORS.txt file in the distribution for a full listing of
- * individual contributors.
- *
- * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
- * is licensed to you under the terms of the GNU Lesser General Public License as
- * published by the Free Software Foundation; either version 2.1 of
- * the License, or (at your option) any later version.
- *
- * JBoss DNA is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with this software; if not, write to the Free
- * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
- * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
- */
-package org.jboss.dna.connector.svn;
-
-import org.jboss.dna.graph.property.Name;
-import org.jboss.dna.graph.property.basic.BasicName;
-
-/**
- * The namespace and property names used within a {@link SVNRepositorySource} to store internal information.
- */
-public class SVNRepositoryLexicon {
-
- public static class Namespace {
- public static final String URI = "http://www.jboss.org/dna/connector/svn";
- public static final String PREFIX = "dnasvn";
- }
-
- public static final Name CHILD_PATH_SEGMENT_LIST = new BasicName(Namespace.URI, "orderedChildNames");
- public static final Name UUID = new BasicName(Namespace.URI, "uuid");
-
-}
Deleted: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryRequestProcessor.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryRequestProcessor.java 2010-01-04 20:58:21 UTC (rev 1523)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryRequestProcessor.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -1,1379 +0,0 @@
-/*
- * JBoss DNA (http://www.jboss.org/dna)
- * See the COPYRIGHT.txt file distributed with this work for information
- * regarding copyright ownership. Some portions may be licensed
- * to Red Hat, Inc. under one or more contributor license agreements.
- * See the AUTHORS.txt file in the distribution for a full listing of
- * individual contributors.
- *
- * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
- * is licensed to you under the terms of the GNU Lesser General Public License as
- * published by the Free Software Foundation; either version 2.1 of
- * the License, or (at your option) any later version.
- *
- * JBoss DNA is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with this software; if not, write to the Free
- * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
- * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
- */
-package org.jboss.dna.connector.svn;
-
-import java.io.ByteArrayOutputStream;
-import java.io.OutputStream;
-import java.util.Arrays;
-import java.util.Collection;
-import java.util.Collections;
-import java.util.HashMap;
-import java.util.HashSet;
-import java.util.LinkedList;
-import java.util.List;
-import java.util.Map;
-import java.util.Set;
-import org.jboss.dna.common.i18n.I18n;
-import org.jboss.dna.common.util.Logger;
-import org.jboss.dna.connector.scm.ScmAction;
-import org.jboss.dna.connector.scm.ScmActionFactory;
-import org.jboss.dna.connector.svn.mgnt.AddDirectory;
-import org.jboss.dna.connector.svn.mgnt.AddFile;
-import org.jboss.dna.connector.svn.mgnt.DeleteEntry;
-import org.jboss.dna.connector.svn.mgnt.UpdateFile;
-import org.jboss.dna.graph.DnaIntLexicon;
-import org.jboss.dna.graph.DnaLexicon;
-import org.jboss.dna.graph.ExecutionContext;
-import org.jboss.dna.graph.JcrLexicon;
-import org.jboss.dna.graph.JcrNtLexicon;
-import org.jboss.dna.graph.Location;
-import org.jboss.dna.graph.NodeConflictBehavior;
-import org.jboss.dna.graph.connector.RepositorySourceException;
-import org.jboss.dna.graph.property.Binary;
-import org.jboss.dna.graph.property.BinaryFactory;
-import org.jboss.dna.graph.property.DateTimeFactory;
-import org.jboss.dna.graph.property.Name;
-import org.jboss.dna.graph.property.NameFactory;
-import org.jboss.dna.graph.property.NamespaceRegistry;
-import org.jboss.dna.graph.property.Path;
-import org.jboss.dna.graph.property.PathFactory;
-import org.jboss.dna.graph.property.PathNotFoundException;
-import org.jboss.dna.graph.property.Property;
-import org.jboss.dna.graph.property.PropertyFactory;
-import org.jboss.dna.graph.property.ValueFactory;
-import org.jboss.dna.graph.request.CloneBranchRequest;
-import org.jboss.dna.graph.request.CloneWorkspaceRequest;
-import org.jboss.dna.graph.request.CopyBranchRequest;
-import org.jboss.dna.graph.request.CreateNodeRequest;
-import org.jboss.dna.graph.request.CreateWorkspaceRequest;
-import org.jboss.dna.graph.request.DeleteBranchRequest;
-import org.jboss.dna.graph.request.DestroyWorkspaceRequest;
-import org.jboss.dna.graph.request.GetWorkspacesRequest;
-import org.jboss.dna.graph.request.InvalidRequestException;
-import org.jboss.dna.graph.request.InvalidWorkspaceException;
-import org.jboss.dna.graph.request.MoveBranchRequest;
-import org.jboss.dna.graph.request.ReadAllChildrenRequest;
-import org.jboss.dna.graph.request.ReadAllPropertiesRequest;
-import org.jboss.dna.graph.request.ReadNodeRequest;
-import org.jboss.dna.graph.request.RenameNodeRequest;
-import org.jboss.dna.graph.request.Request;
-import org.jboss.dna.graph.request.UpdatePropertiesRequest;
-import org.jboss.dna.graph.request.VerifyWorkspaceRequest;
-import org.jboss.dna.graph.request.processor.RequestProcessor;
-import org.tmatesoft.svn.core.SVNDirEntry;
-import org.tmatesoft.svn.core.SVNErrorCode;
-import org.tmatesoft.svn.core.SVNErrorMessage;
-import org.tmatesoft.svn.core.SVNException;
-import org.tmatesoft.svn.core.SVNNodeKind;
-import org.tmatesoft.svn.core.SVNProperties;
-import org.tmatesoft.svn.core.SVNProperty;
-import org.tmatesoft.svn.core.io.SVNRepository;
-
-/**
- * The {@link RequestProcessor} implementation for the file subversion repository connector. This is the class that does the bulk
- * of the work in the subversion repository connector, since it processes all requests.
- */
-public class SVNRepositoryRequestProcessor extends RequestProcessor implements ScmActionFactory {
-
- protected static final String BACK_SLASH = "/";
-
- private static final String DEFAULT_MIME_TYPE = "application/octet-stream";
-
- /**
- * Only certain properties are tolerated when writing content (dna:resource or jcr:resource) nodes. These properties are
- * implicitly stored (primary type, data) or silently ignored (encoded, mimetype, last modified). The silently ignored
- * properties must be accepted to stay compatible with the JCR specification.
- */
- private final Set<Name> ALLOWABLE_PROPERTIES_FOR_CONTENT = Collections.unmodifiableSet(new HashSet<Name>(
- Arrays.asList(new Name[] {
- JcrLexicon.PRIMARY_TYPE,
- JcrLexicon.DATA,
- JcrLexicon.ENCODED,
- JcrLexicon.MIMETYPE,
- JcrLexicon.LAST_MODIFIED,
- JcrLexicon.UUID,
- DnaIntLexicon.NODE_DEFINITON})));
- /**
- * Only certain properties are tolerated when writing files (nt:file) or folders (nt:folder) nodes. These properties are
- * implicitly stored in the file or folder (primary type, created).
- */
- private final Set<Name> ALLOWABLE_PROPERTIES_FOR_FILE_OR_FOLDER = Collections.unmodifiableSet(new HashSet<Name>(
- Arrays.asList(new Name[] {
- JcrLexicon.PRIMARY_TYPE,
- JcrLexicon.CREATED,
- JcrLexicon.UUID,
- DnaIntLexicon.NODE_DEFINITON})));
-
- private final String defaultNamespaceUri;
- private final boolean updatesAllowed;
- private SVNRepository defaultWorkspace;
- protected final Logger logger;
- private final Set<String> availableWorkspaceNames;
- private final boolean creatingWorkspacesAllowed;
- private final RepositoryAccessData accessData;
-
- /**
- * @param sourceName
- * @param context
- * @param defaultWorkspace
- * @param availableWorkspaceNames
- * @param creatingWorkspacesAllowed
- * @param updatesAllowed true if this connector supports updating the subversion repository, or false if the connector is read
- * only
- * @param accessData
- */
- protected SVNRepositoryRequestProcessor( String sourceName,
- SVNRepository defaultWorkspace,
- Set<String> availableWorkspaceNames,
- boolean creatingWorkspacesAllowed,
- ExecutionContext context,
- boolean updatesAllowed,
- RepositoryAccessData accessData ) {
- super(sourceName, context, null);
- assert defaultWorkspace != null;
- assert availableWorkspaceNames != null;
- this.defaultNamespaceUri = getExecutionContext().getNamespaceRegistry().getDefaultNamespaceUri();
- this.updatesAllowed = updatesAllowed;
- this.defaultWorkspace = defaultWorkspace;
- this.logger = getExecutionContext().getLogger(getClass());
- this.availableWorkspaceNames = availableWorkspaceNames;
- this.creatingWorkspacesAllowed = creatingWorkspacesAllowed;
- this.accessData = accessData;
- }
-
- protected void addProperty( List<Property> properties,
- PropertyFactory factory,
- Name propertyName,
- Object value ) {
- if (value != null) {
- properties.add(factory.create(propertyName, value));
- }
- }
-
- protected boolean readNode( String workspaceName,
- Location myLocation,
- List<Property> properties,
- List<Location> children,
- Request request ) {
-
- // Get the SVNRepository object that represents the workspace ...
- SVNRepository workspaceRoot = getWorkspaceDirectory(workspaceName);
- if (workspaceRoot == null) {
- request.setError(new InvalidWorkspaceException(SVNRepositoryConnectorI18n.workspaceDoesNotExist.text(workspaceName)));
- return false;
- }
- Path requestedPath = getPathFor(myLocation, request);
- checkThePath(requestedPath, request); // same-name-sibling indexes are not supported
-
- if (requestedPath.isRoot()) {
- // workspace root must be a directory
- if (children != null) {
- final Collection<SVNDirEntry> entries = SVNRepositoryUtil.getDir(workspaceRoot, "");
- for (SVNDirEntry entry : entries) {
- // All of the children of a directory will be another directory or a file, but never a "jcr:content" node ...
- String localName = entry.getName();
- Name childName = nameFactory().create(defaultNamespaceUri, localName);
- Path childPath = pathFactory().create(requestedPath, childName);
- children.add(Location.create(childPath));
- }
- }
- // There are no properties on the root ...
- } else {
- try {
- // Generate the properties for this File object ...
- PropertyFactory factory = getExecutionContext().getPropertyFactory();
- DateTimeFactory dateFactory = getExecutionContext().getValueFactories().getDateFactory();
-
- // Figure out the kind of node this represents ...
- SVNNodeKind kind = getNodeKind(workspaceRoot, requestedPath, accessData.getRepositoryRootUrl(), workspaceName);
- if (kind == SVNNodeKind.DIR) {
- String directoryPath = getPathAsString(requestedPath);
- if (!accessData.getRepositoryRootUrl().equals(workspaceName)) {
- directoryPath = directoryPath.substring(1);
- }
- if (children != null) {
- // Decide how to represent the children ...
- Collection<SVNDirEntry> dirEntries = SVNRepositoryUtil.getDir(workspaceRoot, directoryPath);
- for (SVNDirEntry entry : dirEntries) {
- // All of the children of a directory will be another directory or a file,
- // but never a "jcr:content" node ...
- String localName = entry.getName();
- Name childName = nameFactory().create(defaultNamespaceUri, localName);
- Path childPath = pathFactory().create(requestedPath, childName);
- children.add(Location.create(childPath));
- }
- }
- if (properties != null) {
- // Load the properties for this directory ...
- addProperty(properties, factory, JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FOLDER);
- SVNDirEntry entry = getEntryInfo(workspaceRoot, directoryPath);
- if (entry != null) {
- addProperty(properties, factory, JcrLexicon.LAST_MODIFIED, dateFactory.create(entry.getDate()));
- }
- }
- } else {
- // It's not a directory, so must be a file; the only child of an nt:file is the "jcr:content" node
- // ...
- if (requestedPath.endsWith(JcrLexicon.CONTENT)) {
- // There are never any children of these nodes, just properties ...
- if (properties != null) {
- String contentPath = getPathAsString(requestedPath.getParent());
- if (!accessData.getRepositoryRootUrl().equals(workspaceName)) {
- contentPath = contentPath.substring(1);
- }
- SVNDirEntry entry = getEntryInfo(workspaceRoot, contentPath);
- if (entry != null) {
- // The request is to get properties of the "jcr:content" child node ...
- // Do NOT use "nt:resource", since it extends "mix:referenceable". The JCR spec
- // does not require that "jcr:content" is of type "nt:resource", but rather just
- // suggests it. Therefore, we can use "dna:resource", which is identical to
- // "nt:resource" except it does not extend "mix:referenceable"
- addProperty(properties, factory, JcrLexicon.PRIMARY_TYPE, DnaLexicon.RESOURCE);
- addProperty(properties, factory, JcrLexicon.LAST_MODIFIED, dateFactory.create(entry.getDate()));
- }
-
- ByteArrayOutputStream os = new ByteArrayOutputStream();
- SVNProperties fileProperties = new SVNProperties();
- getData(contentPath, fileProperties, os);
- String mimeType = fileProperties.getStringValue(SVNProperty.MIME_TYPE);
- if (mimeType == null) mimeType = DEFAULT_MIME_TYPE;
- addProperty(properties, factory, JcrLexicon.MIMETYPE, mimeType);
-
- if (os.toByteArray().length > 0) {
- // Now put the file's content into the "jcr:data" property ...
- BinaryFactory binaryFactory = getExecutionContext().getValueFactories().getBinaryFactory();
- addProperty(properties, factory, JcrLexicon.DATA, binaryFactory.create(os.toByteArray()));
- }
- }
- } else {
- // Determine the corresponding file path for this object ...
- String filePath = getPathAsString(requestedPath);
- if (!accessData.getRepositoryRootUrl().equals(workspaceName)) {
- filePath = filePath.substring(1);
- }
- if (children != null) {
- // Not a "jcr:content" child node but rather an nt:file node, so add the child ...
- Path contentPath = pathFactory().create(requestedPath, JcrLexicon.CONTENT);
- children.add(Location.create(contentPath));
- }
- if (properties != null) {
- // Now add the properties to "nt:file" ...
- addProperty(properties, factory, JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FILE);
- ByteArrayOutputStream os = new ByteArrayOutputStream();
- SVNProperties fileProperties = new SVNProperties();
- getData(filePath, fileProperties, os);
- String created = fileProperties.getStringValue(SVNProperty.COMMITTED_DATE);
- addProperty(properties, factory, JcrLexicon.CREATED, dateFactory.create(created));
- }
- }
- }
- } catch (SVNException e) {
- request.setError(e);
- }
- }
- return true;
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.request.processor.RequestProcessor#process(org.jboss.dna.graph.request.ReadNodeRequest)
- */
- @Override
- public void process( ReadNodeRequest request ) {
- logger.trace(request.toString());
- List<Location> children = new LinkedList<Location>();
- List<Property> properties = new LinkedList<Property>();
- if (readNode(request.inWorkspace(), request.at(), properties, children, request)) {
- request.addChildren(children);
- request.addProperties(properties);
- request.setActualLocationOfNode(request.at());
- setCacheableInfo(request);
- }
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.request.processor.RequestProcessor#process(org.jboss.dna.graph.request.ReadAllChildrenRequest)
- */
- @Override
- public void process( ReadAllChildrenRequest request ) {
- logger.trace(request.toString());
- List<Location> children = new LinkedList<Location>();
- if (readNode(request.inWorkspace(), request.of(), null, children, request)) {
- request.addChildren(children);
- request.setActualLocationOfNode(request.of());
- setCacheableInfo(request);
- }
-
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.request.processor.RequestProcessor#process(org.jboss.dna.graph.request.ReadAllPropertiesRequest)
- */
- @Override
- public void process( ReadAllPropertiesRequest request ) {
- logger.trace(request.toString());
- List<Property> properties = new LinkedList<Property>();
- if (readNode(request.inWorkspace(), request.at(), properties, null, request)) {
- request.addProperties(properties);
- request.setActualLocationOfNode(request.at());
- setCacheableInfo(request);
- }
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.request.processor.RequestProcessor#process(org.jboss.dna.graph.request.CreateNodeRequest)
- */
- @Override
- public void process( CreateNodeRequest request ) {
- logger.trace(request.toString());
- if (!updatesAllowed(request)) return;
-
- // continue
- Path parentPath = getPathFor(request.under(), request);
- if (parentPath == null) return;
-
- // The SVN connector does not support same-name siblings
- sameNameSiblingIsNotSupported(parentPath);
-
- SVNRepository workspaceRoot = getWorkspaceDirectory(request.inWorkspace());
- assert workspaceRoot != null;
-
- SVNNodeKind parent = getSVNNodeKindFor(workspaceRoot, parentPath, request.under(), request.inWorkspace(), request);
- if (parent == null) {
- return;
- }
-
- NamespaceRegistry registry = getExecutionContext().getNamespaceRegistry();
- // New name to commit into the svn repos workspace
- String newName = request.named().getString(registry);
-
- // Collect all the properties of the node in a hash map
- Map<Name, Property> properties = new HashMap<Name, Property>(request.properties().size());
- for (Property property : request.properties()) {
- properties.put(property.getName(), property);
- }
-
- Property primaryTypeProp = properties.get(JcrLexicon.PRIMARY_TYPE);
- Name primaryType = primaryTypeProp == null ? null : nameFactory().create(primaryTypeProp.getFirstValue());
-
- Path newPath = pathFactory().create(parentPath, request.named());
- Location actualLocation = Location.create(newPath);
-
- String newChildPath = null;
-
- // File
- if (JcrNtLexicon.FILE.equals(primaryType)) {
- ensureValidProperties(request.properties(), ALLOWABLE_PROPERTIES_FOR_FILE_OR_FOLDER);
- // The parent node already exists
- boolean skipWrite = false;
-
- if (request.under().getPath().isRoot()) {
- if (!accessData.getRepositoryRootUrl().equals(request.inWorkspace())) {
- newChildPath = newName;
- } else {
- newChildPath = "/" + newName;
- }
- } else {
- newChildPath = getPathAsString(request.under().getPath()) + "/" + newName;
- if (!accessData.getRepositoryRootUrl().equals(request.inWorkspace())) {
- newChildPath = newChildPath.substring(1);
- }
- }
-
- // check if the new name already exists
- try {
- if (SVNRepositoryUtil.exists(workspaceRoot, newChildPath)) {
- if (request.conflictBehavior().equals(NodeConflictBehavior.APPEND)) {
- I18n msg = SVNRepositoryConnectorI18n.sameNameSiblingsAreNotAllowed;
- throw new InvalidRequestException(msg.text("SVN Connector does not support Same Name Sibling"));
- } else if (request.conflictBehavior().equals(NodeConflictBehavior.DO_NOT_REPLACE)) {
- skipWrite = true;
- }
- }
- } catch (SVNException e1) {
- throw new RepositorySourceException(getSourceName(), e1.getMessage());
- }
-
- // Don't try to write if the node conflict behavior is DO_NOT_REPLACE
- if (!skipWrite) {
- // create a new, empty file
- if (newChildPath != null) {
- try {
- String rootPath = null;
- if (request.under().getPath().isRoot()) {
- rootPath = "";
- } else {
- rootPath = getPathAsString(request.under().getPath());
- }
- newFile(rootPath, newName, "".getBytes(), null, request.inWorkspace(), workspaceRoot);
- } catch (SVNException e) {
- I18n msg = SVNRepositoryConnectorI18n.couldNotCreateFile;
- request.setError(new RepositorySourceException(getSourceName(),
- msg.text(getPathAsString(request.under().getPath()),
- request.inWorkspace(),
- getSourceName(),
- e.getMessage()), e));
- return;
- }
- }
- }
- } else if (JcrNtLexicon.RESOURCE.equals(primaryType) || DnaLexicon.RESOURCE.equals(primaryType)) { // Resource
- ensureValidProperties(request.properties(), ALLOWABLE_PROPERTIES_FOR_CONTENT);
- newChildPath = getPathAsString(parentPath);
- if (!accessData.getRepositoryRootUrl().equals(request.inWorkspace())) {
- newChildPath = newChildPath.substring(1);
- }
-
- if (!JcrLexicon.CONTENT.equals(request.named())) {
- I18n msg = SVNRepositoryConnectorI18n.invalidNameForResource;
- String nodeName = request.named().getString(registry);
- request.setError(new RepositorySourceException(getSourceName(),
- msg.text(getPathAsString(request.under().getPath()),
- request.inWorkspace(),
- getSourceName(),
- nodeName)));
- return;
- }
-
- if (parent != SVNNodeKind.FILE) {
- I18n msg = SVNRepositoryConnectorI18n.invalidPathForResource;
- request.setError(new RepositorySourceException(getSourceName(),
- msg.text(getPathAsString(request.under().getPath()),
- request.inWorkspace(),
- getSourceName())));
- return;
- }
-
- boolean skipWrite = false;
- if (parent != SVNNodeKind.NONE && parent != SVNNodeKind.UNKNOWN) {
- if (request.conflictBehavior().equals(NodeConflictBehavior.APPEND)) {
- I18n msg = SVNRepositoryConnectorI18n.sameNameSiblingsAreNotAllowed;
- throw new InvalidRequestException(msg.text("SVN Connector does not support Same Name Sibling"));
- } else if (request.conflictBehavior().equals(NodeConflictBehavior.DO_NOT_REPLACE)) {
- // TODO check if the file already has content
- skipWrite = true;
- }
- }
-
- if (!skipWrite) {
- Property dataProperty = properties.get(JcrLexicon.DATA);
- if (dataProperty == null) {
- I18n msg = SVNRepositoryConnectorI18n.missingRequiredProperty;
- String dataPropName = JcrLexicon.DATA.getString(registry);
- request.setError(new RepositorySourceException(getSourceName(),
- msg.text(getPathAsString(request.under().getPath()),
- request.inWorkspace(),
- getSourceName(),
- dataPropName)));
- return;
- }
-
- BinaryFactory binaryFactory = getExecutionContext().getValueFactories().getBinaryFactory();
- Binary binary = binaryFactory.create(properties.get(JcrLexicon.DATA).getFirstValue());
- // get old data
- ByteArrayOutputStream contents = new ByteArrayOutputStream();
- SVNProperties svnProperties = new SVNProperties();
- try {
- workspaceRoot.getFile(newChildPath, -1, svnProperties, contents);
- byte[] oldData = contents.toByteArray();
- // modify the empty old data with the new resource
- if (oldData != null) {
- String rootPath = null;
- String fileName = null;
-
- Path p = request.under().getPath();
- rootPath = getPathAsString(p.getAncestor(1));
- fileName = p.getLastSegment().getString(registry);
-
- if (request.under().getPath().isRoot()) {
- rootPath = "";
- }
-
- modifyFile(rootPath, fileName, oldData, binary.getBytes(), null, request.inWorkspace(), workspaceRoot);
- }
- } catch (SVNException e) {
- I18n msg = SVNRepositoryConnectorI18n.couldNotReadData;
- request.setError(new RepositorySourceException(getSourceName(),
- msg.text(getPathAsString(request.under().getPath()),
- request.inWorkspace(),
- getSourceName(),
- e.getMessage()), e));
- return;
- }
- }
-
- } else if (JcrNtLexicon.FOLDER.equals(primaryType) || primaryType == null) { // Folder
- ensureValidProperties(request.properties(), ALLOWABLE_PROPERTIES_FOR_FILE_OR_FOLDER);
- try {
- String rootDirPath = getPathAsString(request.under().getPath());
- mkdir(rootDirPath, newName, null, request.inWorkspace(), workspaceRoot);
- } catch (SVNException e) {
- I18n msg = SVNRepositoryConnectorI18n.couldNotCreateFile;
- request.setError(new RepositorySourceException(getSourceName(),
- msg.text(getPathAsString(request.under().getPath()),
- request.inWorkspace(),
- getSourceName(),
- e.getMessage()), e));
- return;
- }
- } else {
- I18n msg = SVNRepositoryConnectorI18n.unsupportedPrimaryType;
- request.setError(new RepositorySourceException(getSourceName(), msg.text(primaryType.getString(registry),
- getPathAsString(request.under().getPath()),
- request.inWorkspace(),
- getSourceName())));
- return;
- }
-
- request.setActualLocationOfNode(actualLocation);
- }
-
- /**
- * @param workspaceRoot the SVN repository for the workspace
- * @param path the path of the node
- * @param location the location of the node
- * @param inWorkspace the name of the workspace
- * @param request the request being processed
- * @return the SVN node kind, or null if it cannot be determined
- */
- protected SVNNodeKind getSVNNodeKindFor( SVNRepository workspaceRoot,
- Path path,
- Location location,
- String inWorkspace,
- Request request ) {
- assert path != null;
- assert location != null;
- assert request != null;
-
- SVNNodeKind rootNode = SVNRepositoryUtil.checkThePath(workspaceRoot, "", -1, getSourceName());
-
- if (rootNode != SVNNodeKind.DIR) return null;
-
- if (path.isRoot()) {
- return rootNode;
- }
-
- // See if the path is a "jcr:content" node ...
- if (path.getLastSegment().getName().equals(JcrLexicon.CONTENT)) {
- // We only want to use the parent path to find the actual file ...
- path = path.getParent();
- }
- SVNNodeKind kind = rootNode;
- for (Path.Segment segment : path) {
- if (segment.getIndex() > 1) {
- I18n msg = SVNRepositoryConnectorI18n.sameNameSiblingsAreNotAllowed;
- throw new RepositorySourceException(getSourceName(), msg.text("SVN Connector does not support Same Name Sibling"));
- }
- }
-
- String currentPath = getPathAsString(path);
- if (!this.accessData.getRepositoryRootUrl().equals(inWorkspace)) {
- if (currentPath.startsWith("/")) {
- currentPath = currentPath.substring(1);
- }
- }
- kind = SVNRepositoryUtil.checkThePath(workspaceRoot, currentPath, -1, getSourceName());
-
- if (kind != null) {
- if (kind == SVNNodeKind.NONE || kind == SVNNodeKind.UNKNOWN) {
- // Unable to complete the path, so prepare the exception by determining the lowest path that exists ...
- request.setError(new RepositorySourceException(getSourceName(), "Node at path " + currentPath
- + " is missing or of unknown kind"));
- return null;
- }
- }
-
- assert kind != null;
- return kind;
- }
-
- protected void sameNameSiblingIsNotSupported( Path path ) {
- for (Path.Segment segment : path) {
- // Verify the segment is valid ...
- if (segment.getIndex() > 1) {
- I18n msg = SVNRepositoryConnectorI18n.sameNameSiblingsAreNotAllowed;
- throw new RepositorySourceException(getSourceName(), msg.text("SVN Connector does not support Same Name Sibling"));
- }
- }
- }
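[Editorial note: the same-name-sibling rejection above (and the TCK failure described in the log) comes down to detecting a segment index greater than 1 in a JCR path, e.g. the `[2]` at the end of the XPathPosIndexTest query. Below is a minimal standalone sketch of that check on string paths; the class and method names are hypothetical and not part of the connector, which uses DNA's `Path.Segment` API instead.]

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SnsCheck {
    // Matches a trailing same-name-sibling index such as "[2]" in a path segment.
    private static final Pattern SNS_INDEX = Pattern.compile("\\[(\\d+)\\]$");

    /** Returns true if any segment of the path carries an SNS index greater than 1. */
    static boolean hasSameNameSiblingIndex(String path) {
        for (String segment : path.split("/")) {
            Matcher m = SNS_INDEX.matcher(segment);
            // Index 1 denotes the first (and, without SNS support, only) sibling.
            if (m.find() && Integer.parseInt(m.group(1)) > 1) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(hasSameNameSiblingIndex("/a/b[2]")); // true
        System.out.println(hasSameNameSiblingIndex("/a/b"));    // false
        System.out.println(hasSameNameSiblingIndex("/a[1]/b")); // false
    }
}
```

A query ending in `[2]`, as the XPathPosIndexTest issues, can never match in a connector where this check throws for any index above 1.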
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.request.processor.RequestProcessor#process(org.jboss.dna.graph.request.UpdatePropertiesRequest)
- */
- @Override
- public void process( UpdatePropertiesRequest request ) {
- logger.trace(request.toString());
- verifyUpdatesAllowed();
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.request.processor.RequestProcessor#process(org.jboss.dna.graph.request.CopyBranchRequest)
- */
- @Override
- public void process( CopyBranchRequest request ) {
- updatesAllowed(request);
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.request.processor.RequestProcessor#process(org.jboss.dna.graph.request.CloneBranchRequest)
- */
- @Override
- public void process( CloneBranchRequest request ) {
- updatesAllowed(request);
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.request.processor.RequestProcessor#process(org.jboss.dna.graph.request.DeleteBranchRequest)
- */
- @Override
- public void process( DeleteBranchRequest request ) {
- logger.trace(request.toString());
- if (!updatesAllowed(request)) return;
-
- SVNRepository workspaceRoot = getWorkspaceDirectory(request.inWorkspace());
- assert workspaceRoot != null;
-
- NamespaceRegistry registry = getExecutionContext().getNamespaceRegistry();
-
- Path requestedPath = request.at().getPath();
- // The SVN connector does not support same-name siblings
- sameNameSiblingIsNotSupported(requestedPath);
-
- if (!requestedPath.isRoot() && JcrLexicon.CONTENT.equals(requestedPath.getLastSegment().getName())) {
- Path p = requestedPath.getAncestor(1);
- if(p != null) {
- String itemPath = getPathAsString(p);
- if (itemPath.equals("") || itemPath.equals("/")) {
- return;
- }
- String filePath = itemPath;
- if (!accessData.getRepositoryRootUrl().equals(request.inWorkspace())) {
- filePath = itemPath.substring(1);
- }
- try {
- // check if the file exists
- if (!SVNRepositoryUtil.exists(workspaceRoot, filePath)) return;
-
- // replace the file's content with empty content
- SVNProperties fileProperties = new SVNProperties();
- ByteArrayOutputStream baos = new ByteArrayOutputStream();
- workspaceRoot.getFile(filePath, -1, fileProperties, baos);
-
- String rootPath = getPathAsString(p.getAncestor(1));
- String fileName = p.getLastSegment().getString(registry);
- modifyFile(rootPath, fileName, baos.toByteArray(), "".getBytes(), null, request.inWorkspace(), workspaceRoot);
-
- } catch (SVNException e) {
- throw new RepositorySourceException(getSourceName(),
- SVNRepositoryConnectorI18n.deleteFailed.text(itemPath, getSourceName()));
-
- }
- }
- } else {
-
- String nodePath = getPathAsString(requestedPath);
-
- if (!accessData.getRepositoryRootUrl().equals(request.inWorkspace())) {
- nodePath = nodePath.substring(1);
- }
-
- try {
- if (!SVNRepositoryUtil.exists(workspaceRoot, nodePath)) return;
- eraseEntry(nodePath, null, request.inWorkspace(), workspaceRoot);
- } catch (SVNException e) {
- throw new RepositorySourceException(getSourceName(),
- SVNRepositoryConnectorI18n.deleteFailed.text(nodePath, getSourceName()));
-
- }
- }
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.request.processor.RequestProcessor#process(org.jboss.dna.graph.request.MoveBranchRequest)
- */
- @Override
- public void process( MoveBranchRequest request ) {
- updatesAllowed(request);
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.request.processor.RequestProcessor#process(org.jboss.dna.graph.request.RenameNodeRequest)
- */
- @Override
- public void process( RenameNodeRequest request ) {
- if (updatesAllowed(request)) super.process(request);
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.request.processor.RequestProcessor#process(org.jboss.dna.graph.request.VerifyWorkspaceRequest)
- */
- @Override
- public void process( VerifyWorkspaceRequest request ) {
- // If the request contains a null name, then we use the default ...
- String workspaceName = request.workspaceName();
- if (workspaceName == null) workspaceName = defaultWorkspace.getLocation().toDecodedString();
-
- SVNRepository repository = null;
- if (!this.creatingWorkspacesAllowed) {
- // Then the workspace name must be one of the available names ...
- boolean found = false;
- for (String available : this.availableWorkspaceNames) {
- if (workspaceName.equals(available)) {
- found = true;
- break;
- }
- repository = SVNRepositoryUtil.createRepository(available, accessData.getUsername(), accessData.getPassword());
- // check that the workspace is valid and matches the requested name
- if (SVNRepositoryUtil.isDirectory(repository, "")
- && repository.getLocation().toDecodedString().equals(workspaceName)) {
- found = true;
- break;
- }
- }
- if (!found) {
- request.setError(new InvalidWorkspaceException(
- SVNRepositoryConnectorI18n.workspaceDoesNotExist.text(workspaceName)));
- return;
- }
- }
-
- // Verify that there is a repos at the path given by the workspace name ...
- repository = SVNRepositoryUtil.createRepository(workspaceName, accessData.getUsername(), accessData.getPassword());
- if (SVNRepositoryUtil.isDirectory(repository, "")) {
- request.setActualWorkspaceName(repository.getLocation().toDecodedString());
- request.setActualRootLocation(Location.create(pathFactory().createRootPath()));
- } else {
- request.setError(new InvalidWorkspaceException(SVNRepositoryConnectorI18n.workspaceDoesNotExist.text(workspaceName)));
- }
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.request.processor.RequestProcessor#process(org.jboss.dna.graph.request.GetWorkspacesRequest)
- */
- @Override
- public void process( GetWorkspacesRequest request ) {
- // Return the set of available workspace names, even if new workspaces can be created ...
- Set<String> names = new HashSet<String>();
- for (String name : this.availableWorkspaceNames) {
- SVNRepository repos = SVNRepositoryUtil.createRepository(name, accessData.getUsername(), accessData.getPassword());
- if (repos != null && SVNRepositoryUtil.isDirectory(repos, "")) {
- names.add(repos.getLocation().toDecodedString());
- } else {
- request.setError(new InvalidWorkspaceException(SVNRepositoryConnectorI18n.workspaceDoesNotExist.text(name)));
- }
- }
- request.setAvailableWorkspaceNames(Collections.unmodifiableSet(names));
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.request.processor.RequestProcessor#process(org.jboss.dna.graph.request.CloneWorkspaceRequest)
- */
- @Override
- public void process( CloneWorkspaceRequest request ) {
- if (!updatesAllowed) {
- request.setError(new InvalidRequestException(
- SVNRepositoryConnectorI18n.sourceDoesNotSupportCloningWorkspaces.text(getSourceName())));
- }
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.request.processor.RequestProcessor#process(org.jboss.dna.graph.request.CreateWorkspaceRequest)
- */
- @Override
- public void process( CreateWorkspaceRequest request ) {
- final String workspaceName = request.desiredNameOfNewWorkspace();
- if (!creatingWorkspacesAllowed) {
- String msg = SVNRepositoryConnectorI18n.unableToCreateWorkspaces.text(getSourceName(), workspaceName);
- request.setError(new InvalidRequestException(msg));
- return;
- }
- // This doesn't create the directory representing the workspace (it must already exist), but it will add
- // the workspace name to the available names ...
- SVNRepository repository = SVNRepositoryUtil.createRepository(workspaceName,
- accessData.getUsername(),
- accessData.getPassword());
- if (SVNRepositoryUtil.isDirectory(repository, "")) {
- request.setActualWorkspaceName(repository.getLocation().toDecodedString());
- request.setActualRootLocation(Location.create(pathFactory().createRootPath()));
- availableWorkspaceNames.add(repository.getLocation().toDecodedString());
- } else {
- request.setError(new InvalidWorkspaceException(SVNRepositoryConnectorI18n.workspaceDoesNotExist.text(workspaceName)));
- }
-
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.request.processor.RequestProcessor#process(org.jboss.dna.graph.request.DestroyWorkspaceRequest)
- */
- @Override
- public void process( DestroyWorkspaceRequest request ) {
- final String workspaceName = request.workspaceName();
- if (!creatingWorkspacesAllowed) {
- String msg = SVNRepositoryConnectorI18n.unableToCreateWorkspaces.text(getSourceName(), workspaceName);
- request.setError(new InvalidRequestException(msg));
- }
- // This doesn't delete the file/directory; rather, it just removes the workspace from the available set ...
- if (!this.availableWorkspaceNames.remove(workspaceName)) {
- request.setError(new InvalidWorkspaceException(SVNRepositoryConnectorI18n.workspaceDoesNotExist.text(workspaceName)));
- }
- }
-
- /**
- * Get the repository for the default workspace.
- *
- * @return the default workspace repository
- */
- public SVNRepository getDefaultWorkspace() {
- return defaultWorkspace;
- }
-
- /**
- * Get the latest revision.
- *
- * @param repos the SVN repository
- * @return the latest revision number.
- * @throws Exception if the revision cannot be retrieved
- */
- public long getLatestRevision( SVNRepository repos ) throws Exception {
- try {
- return repos.getLatestRevision();
- } catch (SVNException e) {
- e.printStackTrace();
- // logger.error( "svn error: " );
- throw e;
- }
- }
-
- /**
- * Create a directory.
- *
- * @param rootDirPath - the root directory where the created directory will reside
- * @param childDirPath - the name of the directory to create.
- * @param comment - comment for the creation.
- * @param inWorkspace - the name of the workspace
- * @param currentRepository - the SVN repository to commit to
- * @throws SVNException - if an error occurs during the creation.
- */
- private void mkdir( String rootDirPath,
- String childDirPath,
- String comment,
- String inWorkspace,
- SVNRepository currentRepository ) throws SVNException {
-
- String tempParentPath = rootDirPath;
- if (!this.accessData.getRepositoryRootUrl().equals(inWorkspace)) {
- if (!tempParentPath.equals("/") && tempParentPath.startsWith("/")) {
- tempParentPath = tempParentPath.substring(1);
- } else if (tempParentPath.equals("/")) {
- tempParentPath = "";
- }
- }
- String checkPath = tempParentPath.length() == 0 ? childDirPath : tempParentPath + "/" + childDirPath;
- SVNNodeKind nodeKind = null;
- try {
- nodeKind = currentRepository.checkPath(checkPath, -1);
- } catch (SVNException e) {
- SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
- "May be a Connecting problem to the repository or a user's authentication failure: {0}",
- e.getMessage());
- throw new SVNException(err);
- }
-
- if (nodeKind != null && nodeKind == SVNNodeKind.NONE) {
- ScmAction addNodeAction = addDirectory(rootDirPath, childDirPath);
- SVNActionExecutor executor = new SVNActionExecutor(currentRepository);
- comment = comment == null ? "Create a new directory " + childDirPath : comment;
- executor.execute(addNodeAction, comment);
- } else {
- SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
- "Node with name '{0}' can't be created",
- childDirPath);
- throw new SVNException(err);
- }
- }
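[Editorial note: the trickiest part of `mkdir` is composing `checkPath`: when the workspace is not the repository root URL, the leading slash must be stripped so the path is relative to the workspace. A minimal sketch of that joining logic follows; the class and method names are hypothetical, and `workspaceIsRepositoryRoot` stands in for `accessData.getRepositoryRootUrl().equals(inWorkspace)`.]

```java
public class SvnPathJoin {
    /**
     * Joins a parent directory path and a child name into the path handed to
     * SVNRepository.checkPath. When the workspace is not the repository root,
     * the leading "/" is stripped so the result is workspace-relative.
     */
    static String joinCheckPath(String parent, String child, boolean workspaceIsRepositoryRoot) {
        String p = parent;
        if (!workspaceIsRepositoryRoot) {
            if (!p.equals("/") && p.startsWith("/")) {
                p = p.substring(1);      // "/a/b" -> "a/b"
            } else if (p.equals("/")) {
                p = "";                  // root parent collapses to empty prefix
            }
        }
        return p.length() == 0 ? child : p + "/" + child;
    }

    public static void main(String[] args) {
        System.out.println(joinCheckPath("/", "docs", false));    // docs
        System.out.println(joinCheckPath("/a/b", "docs", false)); // a/b/docs
        System.out.println(joinCheckPath("/a/b", "docs", true));  // /a/b/docs
    }
}
```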
-
- /**
- * Create a file.
- *
- * @param rootDirPath the parent directory of the new file
- * @param childFilePath the name of the file to create
- * @param content the initial content of the file
- * @param comment the commit comment, or null for a default
- * @param inWorkspace the name of the workspace
- * @param currentRepository the SVN repository to commit to
- * @throws SVNException if the file already exists or the creation fails
- */
- private void newFile( String rootDirPath,
- String childFilePath,
- byte[] content,
- String comment,
- String inWorkspace,
- SVNRepository currentRepository ) throws SVNException {
-
- String tempParentPath = rootDirPath;
- if (!this.accessData.getRepositoryRootUrl().equals(inWorkspace)) {
- if (!tempParentPath.equals("/") && tempParentPath.startsWith("/")) {
- tempParentPath = tempParentPath.substring(1);
- }
- }
- String checkPath = tempParentPath + "/" + childFilePath;
- SVNNodeKind nodeKind = null;
- try {
- nodeKind = currentRepository.checkPath(checkPath, -1);
- } catch (SVNException e) {
- SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
- "May be a Connecting problem to the repository or a user's authentication failure: {0}",
- e.getMessage());
- throw new SVNException(err);
- }
-
- if (nodeKind != null && nodeKind == SVNNodeKind.NONE) {
- ScmAction addFileNodeAction = addFile(rootDirPath, childFilePath, content);
- SVNActionExecutor executor = new SVNActionExecutor(currentRepository);
- comment = comment == null ? "Create a new file " + childFilePath : comment;
- executor.execute(addFileNodeAction, comment);
- } else {
- SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
- "Item with name '{0}' can't be created (already exist)",
- childFilePath);
- throw new SVNException(err);
- }
- }
-
- /**
- * Modify a file.
- *
- * @param rootPath the parent directory of the file
- * @param fileName the name of the file
- * @param oldData the existing content of the file
- * @param newData the new content of the file
- * @param comment the commit comment, or null for a default
- * @param inWorkspace the name of the workspace
- * @param currentRepository the SVN repository to commit to
- * @throws SVNException if the file cannot be found or the update fails
- */
- private void modifyFile( String rootPath,
- String fileName,
- byte[] oldData,
- byte[] newData,
- String comment,
- String inWorkspace,
- SVNRepository currentRepository ) throws SVNException {
- assert rootPath != null;
- assert fileName != null;
- assert oldData != null;
- assert inWorkspace != null;
- assert currentRepository != null;
-
- try {
-
- if (!this.accessData.getRepositoryRootUrl().equals(inWorkspace)) {
- if (rootPath.equals("/")) {
- rootPath = "";
- } else {
- rootPath = rootPath.substring(1) + "/";
- }
- } else {
- if (!rootPath.equals("/")) {
- rootPath = rootPath + "/";
- }
- }
- String path = rootPath + fileName;
-
- SVNNodeKind nodeKind = currentRepository.checkPath(path, -1);
- if (nodeKind == SVNNodeKind.NONE || nodeKind == SVNNodeKind.UNKNOWN) {
- SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.ENTRY_NOT_FOUND,
- "Item with name '{0}' can't be found",
- path);
- throw new SVNException(err);
- }
-
- ScmAction modifyFileAction = updateFile(rootPath, fileName, oldData, newData);
- SVNActionExecutor executor = new SVNActionExecutor(currentRepository);
- comment = comment == null ? "Modify the file " + fileName : comment;
- executor.execute(modifyFileAction, comment);
-
- } catch (SVNException e) {
- SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN, "An error occurred: '{0}'", e.getMessage());
- throw new SVNException(err);
- }
-
- }
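[Editorial note: `modifyFile` normalizes the parent-directory prefix before appending the file name, with different slash handling depending on whether the workspace is the repository root. A standalone sketch of that normalization, under the same hedges as above (hypothetical names; `workspaceIsRepositoryRoot` models the `getRepositoryRootUrl().equals(inWorkspace)` test):]

```java
public class SvnRootPathNorm {
    /**
     * Normalizes the parent directory prefix the way modifyFile does before
     * concatenating the file name: workspace-relative paths drop the leading
     * slash and gain a trailing one; root-URL workspaces keep absolute paths.
     */
    static String normalizeRoot(String rootPath, boolean workspaceIsRepositoryRoot) {
        if (!workspaceIsRepositoryRoot) {
            if (rootPath.equals("/")) return "";          // file lives at the workspace root
            return rootPath.substring(1) + "/";           // "/a/b" -> "a/b/"
        }
        if (!rootPath.equals("/")) return rootPath + "/"; // "/a/b" -> "/a/b/"
        return rootPath;                                  // "/" stays "/"
    }

    public static void main(String[] args) {
        System.out.println(normalizeRoot("/", false) + "f.txt");    // f.txt
        System.out.println(normalizeRoot("/a/b", false) + "f.txt"); // a/b/f.txt
        System.out.println(normalizeRoot("/a/b", true) + "f.txt");  // /a/b/f.txt
    }
}
```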
-
- /**
- * Delete an entry from the repository.
- *
- * @param path the path of the entry to delete
- * @param comment the commit comment, or null for a default
- * @param inWorkspace the name of the workspace
- * @param currentRepository the SVN repository to commit to
- * @throws SVNException if the entry is the root or the deletion fails
- */
- private void eraseEntry( String path,
- String comment,
- String inWorkspace,
- SVNRepository currentRepository ) throws SVNException {
- assert path != null;
- assert inWorkspace != null;
- if (path.equals("/") || path.equals("")) {
- SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.BAD_URL, "The root directory cannot be deleted");
- throw new SVNException(err);
- }
-
- try {
- ScmAction deleteEntryAction = deleteEntry(path);
- SVNActionExecutor executor = new SVNActionExecutor(currentRepository);
- comment = comment == null ? "Delete the " + path : comment;
- executor.execute(deleteEntryAction, comment);
- } catch (SVNException e) {
- SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
- "unknow error during delete action: {0)",
- e.getMessage());
- throw new SVNException(err);
- }
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.connector.scm.ScmActionFactory#addDirectory(java.lang.String, java.lang.String)
- */
- public ScmAction addDirectory( String root,
- String path ) {
- return new AddDirectory(root, path);
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.connector.scm.ScmActionFactory#addFile(java.lang.String, java.lang.String, byte[])
- */
- public ScmAction addFile( String path,
- String file,
- byte[] content ) {
- return new AddFile(path, file, content);
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.connector.scm.ScmActionFactory#updateFile(java.lang.String, java.lang.String, byte[], byte[])
- */
- public ScmAction updateFile( String rootPath,
- String fileName,
- byte[] oldData,
- byte[] newData ) {
- return new UpdateFile(rootPath, fileName, oldData, newData);
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.connector.scm.ScmActionFactory#deleteEntry(java.lang.String)
- */
- public ScmAction deleteEntry( String path ) {
- return new DeleteEntry(path);
- }
-
- protected void checkThePath( Path path,
- Request request ) {
- for (Path.Segment segment : path) {
- // Verify the segment is valid ...
- if (segment.getIndex() > 1) {
- I18n msg = SVNRepositoryConnectorI18n.sameNameSiblingsAreNotAllowed;
- throw new RepositorySourceException(getSourceName(), msg.text("SVN Connector does not support Same Name Sibling"));
- }
- }
- }
-
- protected SVNRepository getWorkspaceDirectory( String workspaceName ) {
- SVNRepository repository = defaultWorkspace;
- if (workspaceName != null) {
- SVNRepository repos = SVNRepositoryUtil.createRepository(workspaceName,
- accessData.getUsername(),
- accessData.getPassword());
- if (SVNRepositoryUtil.isDirectory(repos, "")) {
- repository = repos;
- } else {
- return null;
- }
- }
- return repository;
- }
-
- protected SVNNodeKind getNodeKind( SVNRepository repository,
- Path path,
- String repositoryRootUrl,
- String inWorkspace ) throws SVNException {
- assert path != null;
- assert repositoryRootUrl != null;
- assert inWorkspace != null;
- // See if the path is a "jcr:content" node ...
- if (path.endsWith(JcrLexicon.CONTENT)) {
- // We only want to use the parent path to find the actual file ...
- path = path.getParent();
- }
- String pathAsString = getPathAsString(path);
- if (!repositoryRootUrl.equals(inWorkspace)) {
- pathAsString = pathAsString.substring(1);
- }
- SVNNodeKind kind = repository.checkPath(pathAsString, -1);
- if (kind == SVNNodeKind.NONE) {
- // the node does not exist or the requested path is not correct.
- throw new PathNotFoundException(Location.create(path), null,
- SVNRepositoryConnectorI18n.nodeDoesNotExist.text(pathAsString));
- } else if (kind == SVNNodeKind.UNKNOWN) {
- // node is unknown
- throw new PathNotFoundException(Location.create(path), null,
- SVNRepositoryConnectorI18n.nodeIsActuallyUnknow.text(pathAsString));
- }
- return kind;
- }
-
- /**
- * Checks that the collection of {@code properties} only contains properties with allowable names.
- *
- * @param properties the properties to check
- * @param validPropertyNames the set of allowable property names
- * @throws RepositorySourceException if {@code properties} contains a property whose name is not in {@code validPropertyNames}
- * @see #ALLOWABLE_PROPERTIES_FOR_CONTENT
- * @see #ALLOWABLE_PROPERTIES_FOR_FILE_OR_FOLDER
- */
- protected void ensureValidProperties( Collection<Property> properties,
- Set<Name> validPropertyNames ) {
- List<String> invalidNames = new LinkedList<String>();
- NamespaceRegistry registry = getExecutionContext().getNamespaceRegistry();
-
- for (Property property : properties) {
- if (!validPropertyNames.contains(property.getName())) {
- invalidNames.add(property.getName().getString(registry));
- }
- }
-
- if (!invalidNames.isEmpty()) {
- throw new RepositorySourceException(this.getSourceName(),
- SVNRepositoryConnectorI18n.invalidPropertyNames.text(invalidNames.toString()));
- }
- }
-
- /**
- * Validate the kind of node and throw an exception if necessary.
- *
- * @param repos the SVN repository
- * @param requestedPath the requested path
- * @return the kind.
- */
- protected SVNNodeKind validateNodeKind( SVNRepository repos,
- Path requestedPath ) {
- SVNNodeKind kind;
- String myPath;
- if (getPathAsString(requestedPath).trim().equals("/")) {
- myPath = getPathAsString(requestedPath);
- } else if (requestedPath.endsWith(JcrLexicon.CONTENT)) {
- myPath = getPathAsString(requestedPath.getParent());
- } else {
- // directory and file
- myPath = getPathAsString(requestedPath);
- }
-
- try {
-
- kind = repos.checkPath(myPath, -1);
- if (kind == SVNNodeKind.NONE) {
- // the node does not exist or the requested path is not correct.
- throw new PathNotFoundException(Location.create(requestedPath), null,
- SVNRepositoryConnectorI18n.nodeDoesNotExist.text(myPath));
- } else if (kind == SVNNodeKind.UNKNOWN) {
- // node is unknown
- throw new PathNotFoundException(Location.create(requestedPath), null,
- SVNRepositoryConnectorI18n.nodeIsActuallyUnknow.text(myPath));
- }
- } catch (SVNException e) {
- throw new RepositorySourceException(
- getSourceName(),
- SVNRepositoryConnectorI18n.connectingFailureOrUserAuthenticationProblem.text(getSourceName()));
- }
-
- return kind;
- }
-
- /**
- * Verify that changes are allowed on this source.
- *
- * @throws RepositorySourceException if change on that repository source is not allowed.
- */
- protected void verifyUpdatesAllowed() {
- if (!updatesAllowed) {
- throw new InvalidRequestException(SVNRepositoryConnectorI18n.sourceIsReadOnly.text(getSourceName()));
- }
- }
-
- protected boolean updatesAllowed( Request request ) {
- if (!updatesAllowed) {
- request.setError(new InvalidRequestException(SVNRepositoryConnectorI18n.sourceIsReadOnly.text(getSourceName())));
- }
- return !request.hasError();
- }
-
- /**
- * Factory for name creation.
- *
- * @return the name factory
- */
- protected NameFactory nameFactory() {
- return getExecutionContext().getValueFactories().getNameFactory();
- }
-
- /**
- * Factory for path creation.
- *
- * @return a path factory.
- */
- protected PathFactory pathFactory() {
- return getExecutionContext().getValueFactories().getPathFactory();
- }
-
- /**
- * Factory for property creation.
- *
- * @return the property factory.
- */
- protected PropertyFactory propertyFactory() {
- return getExecutionContext().getPropertyFactory();
- }
-
- /**
- * Factory for date creation.
- *
- * @return the date factory.
- */
- protected DateTimeFactory dateFactory() {
- return getExecutionContext().getValueFactories().getDateFactory();
- }
-
- /**
- * Factory for binary creation.
- *
- * @return the binary factory.
- */
- protected ValueFactory<Binary> binaryFactory() {
- return getExecutionContext().getValueFactories().getBinaryFactory();
- }
-
- /**
- * Get the path for a location, checking that the path is not null.
- *
- * @param location - the location.
- * @param request - the request that supplied the location.
- * @return the path.
- * @throws RepositorySourceException if the path of a location is null.
- */
- protected Path getPathFor( Location location,
- Request request ) {
- Path path = location.getPath();
- if (path == null) {
- I18n msg = SVNRepositoryConnectorI18n.locationInRequestMustHavePath;
- throw new RepositorySourceException(getSourceName(), msg.text(getSourceName(), request));
- }
- return path;
- }
-
- /**
- * Get the content of a file.
- *
- * @param path - the path to that file.
- * @param properties - the properties of the file.
- * @param os - the output stream in which to store the content.
- * @throws SVNException - thrown if the path does not exist at that revision or in case of a connection problem.
- */
- protected void getData( String path,
- SVNProperties properties,
- OutputStream os ) throws SVNException {
- getDefaultWorkspace().getFile(path, -1, properties, os);
-
- }
-
- protected String getPathAsString( Path path ) {
- return path.getString(getExecutionContext().getNamespaceRegistry());
- }
-
- /**
- * Get information about a path.
- *
- * @param repos
- * @param path - the path
- * @return - the {@link SVNDirEntry}, or null if there is no such entry
- */
- protected SVNDirEntry getEntryInfo( SVNRepository repos,
- String path ) {
- assert path != null;
- SVNDirEntry entry = null;
- try {
- entry = repos.info(path, -1);
- } catch (SVNException e) {
- throw new RepositorySourceException(
- getSourceName(),
- SVNRepositoryConnectorI18n.connectingFailureOrUserAuthenticationProblem.text(getSourceName()));
- }
- return entry;
- }
-}
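As an aside, the invalid-name collection pattern used by ensureValidProperties above can be sketched without the DNA or SVNKit dependencies. The class and method names below are illustrative only, not part of the connector; plain strings stand in for DNA's Property and Name types:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.LinkedList;
import java.util.List;
import java.util.Set;

public class PropertyNameCheck {

    /**
     * Collects the names that are not in the allowable set, mirroring the
     * shape of ensureValidProperties: iterate, accumulate offenders, and let
     * the caller decide whether a non-empty result is an error.
     *
     * @param names the property names to check
     * @param validNames the allowable names
     * @return the invalid names, in encounter order; never null
     */
    public static List<String> invalidNames(List<String> names, Set<String> validNames) {
        List<String> invalid = new LinkedList<String>();
        for (String name : names) {
            if (!validNames.contains(name)) {
                invalid.add(name);
            }
        }
        return invalid;
    }

    public static void main(String[] args) {
        Set<String> valid = new HashSet<String>(Arrays.asList("jcr:primaryType", "jcr:data"));
        List<String> result = invalidNames(Arrays.asList("jcr:data", "foo:bar"), valid);
        System.out.println(result); // prints [foo:bar]
    }
}
```

Reporting all offending names at once, as the connector does, gives a more useful error message than failing on the first bad property.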
Deleted: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositorySource.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositorySource.java 2010-01-04 20:58:21 UTC (rev 1523)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositorySource.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -1,581 +0,0 @@
-/*
- * JBoss DNA (http://www.jboss.org/dna)
- * See the COPYRIGHT.txt file distributed with this work for information
- * regarding copyright ownership. Some portions may be licensed
- * to Red Hat, Inc. under one or more contributor license agreements.
- * See the AUTHORS.txt file in the distribution for a full listing of
- * individual contributors.
- *
- * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
- * is licensed to you under the terms of the GNU Lesser General Public License as
- * published by the Free Software Foundation; either version 2.1 of
- * the License, or (at your option) any later version.
- *
- * JBoss DNA is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with this software; if not, write to the Free
- * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
- * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
- */
-package org.jboss.dna.connector.svn;
-
-import java.util.Enumeration;
-import java.util.HashMap;
-import java.util.Hashtable;
-import java.util.List;
-import java.util.Map;
-import java.util.concurrent.CopyOnWriteArraySet;
-import javax.naming.Context;
-import javax.naming.Name;
-import javax.naming.RefAddr;
-import javax.naming.Reference;
-import javax.naming.StringRefAddr;
-import javax.naming.spi.ObjectFactory;
-import net.jcip.annotations.Immutable;
-import net.jcip.annotations.ThreadSafe;
-import org.jboss.dna.common.i18n.I18n;
-import org.jboss.dna.common.util.CheckArg;
-import org.jboss.dna.common.util.Logger;
-import org.jboss.dna.common.util.StringUtil;
-import org.jboss.dna.graph.cache.CachePolicy;
-import org.jboss.dna.graph.connector.RepositoryConnection;
-import org.jboss.dna.graph.connector.RepositoryContext;
-import org.jboss.dna.graph.connector.RepositorySource;
-import org.jboss.dna.graph.connector.RepositorySourceCapabilities;
-import org.jboss.dna.graph.connector.RepositorySourceException;
-import org.tmatesoft.svn.core.io.SVNRepository;
-
-/**
- * The {@link RepositorySource} for the connector that exposes an area of the local/remote svn repository as content in a
- * repository. This source considers a workspace name to be the path, relative to the repository's root directory, of the
- * directory that represents the root of that workspace. New workspaces can be created, as long as the names represent valid
- * paths to existing directories.
- */
-@ThreadSafe
-public class SVNRepositorySource implements RepositorySource, ObjectFactory {
-
- /**
- * The first serialized version of this source. Version {@value}.
- */
- private static final long serialVersionUID = 1L;
-
- protected static final String SOURCE_NAME = "sourceName";
- protected static final String SVN_REPOSITORY_ROOT_URL = "repositoryRootURL";
- protected static final String SVN_USERNAME = "username";
- protected static final String SVN_PASSWORD = "password";
- protected static final String CACHE_TIME_TO_LIVE_IN_MILLISECONDS = "cacheTimeToLiveInMilliseconds";
- protected static final String RETRY_LIMIT = "retryLimit";
- protected static final String DEFAULT_WORKSPACE = "defaultWorkspace";
- protected static final String PREDEFINED_WORKSPACE_NAMES = "predefinedWorkspaceNames";
- protected static final String ALLOW_CREATING_WORKSPACES = "allowCreatingWorkspaces";
-
- /**
- * This source supports events.
- */
- protected static final boolean SUPPORTS_EVENTS = true;
- /**
- * This source supports same-name-siblings.
- */
- protected static final boolean SUPPORTS_SAME_NAME_SIBLINGS = false;
- /**
- * This source supports creating workspaces by default.
- */
- protected static final boolean DEFAULT_SUPPORTS_CREATING_WORKSPACES = true;
- /**
- * Whether this source supports updates by default; each instance may be configured to be read-only or updateable.
- */
- public static final boolean DEFAULT_SUPPORTS_UPDATES = false;
-
- /**
- * This source supports creating references.
- */
- protected static final boolean SUPPORTS_REFERENCES = false;
-
- public static final int DEFAULT_RETRY_LIMIT = 0;
- public static final int DEFAULT_CACHE_TIME_TO_LIVE_IN_SECONDS = 60 * 5; // 5 minutes
-
- private volatile String name;
- private volatile String repositoryRootURL;
- private volatile String username;
- private volatile String password;
- private volatile int retryLimit = DEFAULT_RETRY_LIMIT;
- private volatile int cacheTimeToLiveInMilliseconds = DEFAULT_CACHE_TIME_TO_LIVE_IN_SECONDS * 1000;
- private volatile String defaultWorkspace;
- private volatile String[] predefinedWorkspaces = new String[] {};
- private volatile RepositorySourceCapabilities capabilities = new RepositorySourceCapabilities(
- SUPPORTS_SAME_NAME_SIBLINGS,
- DEFAULT_SUPPORTS_UPDATES,
- SUPPORTS_EVENTS,
- DEFAULT_SUPPORTS_CREATING_WORKSPACES,
- SUPPORTS_REFERENCES);
-
- private transient CachePolicy cachePolicy;
- private transient CopyOnWriteArraySet<String> availableWorspaceNames;
-
- /**
- * Create a repository source instance.
- */
- public SVNRepositorySource() {
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.connector.RepositorySource#getCapabilities()
- */
- public RepositorySourceCapabilities getCapabilities() {
- return capabilities;
- }
-
- /**
- * {@inheritDoc}
- */
- public String getName() {
- return this.name;
- }
-
- /**
- * Set the name for the source.
- *
- * @param name the new name for the source
- */
- public synchronized void setName( String name ) {
- if (name != null) {
- name = name.trim();
- if (name.length() == 0) name = null;
- }
- this.name = name;
- }
-
- /**
- * @return the repository root URL
- */
- public String getRepositoryRootURL() {
- return this.repositoryRootURL;
- }
-
- /**
- * Set the url for the subversion repository.
- *
- * @param url - the url location.
- * @throws IllegalArgumentException if the SVN URL is null or empty
- */
- public synchronized void setRepositoryRootURL( String url ) {
- CheckArg.isNotEmpty(url, "RepositoryRootURL");
- this.repositoryRootURL = url;
- }
-
- public String getUsername() {
- return this.username;
- }
-
- /**
- * @param username the username credential
- */
- public synchronized void setUsername( String username ) {
- this.username = username;
- }
-
- public String getPassword() {
- return this.password;
- }
-
- /**
- * @param password the password credential
- */
- public synchronized void setPassword( String password ) {
- this.password = password;
- }
-
- /**
- * Get whether this source supports updates.
- *
- * @return true if this source supports updates, or false if this source only supports reading content.
- */
- public boolean getSupportsUpdates() {
- return capabilities.supportsUpdates();
- }
-
- /**
- * Get the file system path to the existing directory that should be used for the default workspace. If the default is
- * specified as a null String or is not a valid and resolvable path, this source will consider the default to be the current
- * working directory of this virtual machine, as defined by <code>new File(".")</code>.
- *
- * @return the file system path to the directory representing the default workspace, or null if the default should be the
- * current working directory
- */
- public String getDirectoryForDefaultWorkspace() {
- return defaultWorkspace;
- }
-
- /**
- * Set the file system path to the existing directory that should be used for the default workspace. If the default is
- * specified as a null String or is not a valid and resolvable path, this source will consider the default to be the current
- * working directory of this virtual machine, as defined by <code>new File(".")</code>.
- *
- * @param pathToDirectoryForDefaultWorkspace the valid and resolvable file system path to the directory representing the
- * default workspace, or null if the current working directory should be used as the default workspace
- */
- public synchronized void setDirectoryForDefaultWorkspace( String pathToDirectoryForDefaultWorkspace ) {
- this.defaultWorkspace = pathToDirectoryForDefaultWorkspace;
- }
-
- /**
- * Gets the names of the workspaces that are available when this source is created. Each workspace name corresponds to a path
- * to a directory on the file system.
- *
- * @return the names of the workspaces that this source starts with, or null if there are no such workspaces
- * @see #setPredefinedWorkspaceNames(String[])
- * @see #setCreatingWorkspacesAllowed(boolean)
- */
- public synchronized String[] getPredefinedWorkspaceNames() {
- String[] copy = new String[predefinedWorkspaces.length];
- System.arraycopy(predefinedWorkspaces, 0, copy, 0, predefinedWorkspaces.length);
- return copy;
- }
-
- /**
- * Sets the names of the workspaces that are available when this source is created. Each workspace name corresponds to a path
- * to a directory on the file system.
- *
- * @param predefinedWorkspaceNames the names of the workspaces that this source should start with, or null if there are no
- * such workspaces
- * @see #setCreatingWorkspacesAllowed(boolean)
- * @see #getPredefinedWorkspaceNames()
- */
- public synchronized void setPredefinedWorkspaceNames( String[] predefinedWorkspaceNames ) {
- this.predefinedWorkspaces = predefinedWorkspaceNames;
- }
-
- /**
- * Get whether this source allows workspaces to be created dynamically.
- *
- * @return true if this source allows workspaces to be created by clients, or false if the set of workspaces is fixed
- * @see #setPredefinedWorkspaceNames(String[])
- * @see #getPredefinedWorkspaceNames()
- * @see #setCreatingWorkspacesAllowed(boolean)
- */
- public boolean isCreatingWorkspacesAllowed() {
- return capabilities.supportsCreatingWorkspaces();
- }
-
- /**
- * Set whether this source allows workspaces to be created dynamically.
- *
- * @param allowWorkspaceCreation true if this source allows workspaces to be created by clients, or false if the set of
- * workspaces is fixed
- * @see #setPredefinedWorkspaceNames(String[])
- * @see #getPredefinedWorkspaceNames()
- * @see #isCreatingWorkspacesAllowed()
- */
- public synchronized void setCreatingWorkspacesAllowed( boolean allowWorkspaceCreation ) {
- capabilities = new RepositorySourceCapabilities(capabilities.supportsSameNameSiblings(), capabilities.supportsUpdates(),
- capabilities.supportsEvents(), allowWorkspaceCreation,
- capabilities.supportsReferences());
- }
-
- /**
- * Get whether this source allows updates.
- *
- * @return true if this source allows updates by clients, or false if no updates are allowed
- * @see #setUpdatesAllowed(boolean)
- */
- public boolean areUpdatesAllowed() {
- return capabilities.supportsUpdates();
- }
-
- /**
- * Set whether this source allows updates to data within workspaces.
- *
- * @param allowUpdates true if this source allows updates to data within workspaces by clients, or false if updates are not
- * allowed.
- * @see #areUpdatesAllowed()
- */
- public synchronized void setUpdatesAllowed( boolean allowUpdates ) {
- capabilities = new RepositorySourceCapabilities(capabilities.supportsSameNameSiblings(), allowUpdates,
- capabilities.supportsEvents(), capabilities.supportsCreatingWorkspaces(),
- capabilities.supportsReferences());
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.connector.RepositorySource#getRetryLimit()
- */
- public int getRetryLimit() {
- return retryLimit;
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.connector.RepositorySource#setRetryLimit(int)
- */
- public void setRetryLimit( int limit ) {
- retryLimit = limit < 0 ? 0 : limit;
- }
-
- /**
- * Get the time in milliseconds that content returned from this source may be used while in the cache.
- *
- * @return the time to live, in milliseconds, or 0 if the time to live is not specified by this source
- */
- public int getCacheTimeToLiveInMilliseconds() {
- return cacheTimeToLiveInMilliseconds;
- }
-
- /**
- * Set the time in milliseconds that content returned from this source may be used while in the cache.
- *
- * @param cacheTimeToLive the time to live, in milliseconds; 0 if the time to live is not specified by this source; or a
- * negative number for the default value
- */
- public synchronized void setCacheTimeToLiveInMilliseconds( int cacheTimeToLive ) {
- if (cacheTimeToLive < 0) cacheTimeToLive = DEFAULT_CACHE_TIME_TO_LIVE_IN_SECONDS;
- this.cacheTimeToLiveInMilliseconds = cacheTimeToLive;
- this.cachePolicy = cacheTimeToLiveInMilliseconds > 0 ? new SVNRepositoryCachePolicy(cacheTimeToLiveInMilliseconds) : null;
-
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.connector.RepositorySource#initialize(org.jboss.dna.graph.connector.RepositoryContext)
- */
- public synchronized void initialize( RepositoryContext context ) throws RepositorySourceException {
- // No need to do anything
- }
-
- /**
- * {@inheritDoc}
- */
- @Override
- public boolean equals( Object obj ) {
- if (obj == this) return true;
- if (obj instanceof SVNRepositorySource) {
- SVNRepositorySource that = (SVNRepositorySource)obj;
- if (this.getName() == null) {
- if (that.getName() != null) return false;
- } else {
- if (!this.getName().equals(that.getName())) return false;
- }
- return true;
- }
- return false;
- }
-
- /**
- * {@inheritDoc}
- *
- * @see javax.naming.Referenceable#getReference()
- */
- public synchronized Reference getReference() {
- String className = getClass().getName();
- String factoryClassName = this.getClass().getName();
- Reference ref = new Reference(className, factoryClassName, null);
-
- if (getName() != null) {
- ref.add(new StringRefAddr(SOURCE_NAME, getName()));
- }
- if (getRepositoryRootURL() != null) {
- ref.add(new StringRefAddr(SVN_REPOSITORY_ROOT_URL, getRepositoryRootURL()));
- }
- if (getUsername() != null) {
- ref.add(new StringRefAddr(SVN_USERNAME, getUsername()));
- }
- if (getPassword() != null) {
- ref.add(new StringRefAddr(SVN_PASSWORD, getPassword()));
- }
- ref.add(new StringRefAddr(CACHE_TIME_TO_LIVE_IN_MILLISECONDS, Integer.toString(getCacheTimeToLiveInMilliseconds())));
- ref.add(new StringRefAddr(RETRY_LIMIT, Integer.toString(getRetryLimit())));
- ref.add(new StringRefAddr(DEFAULT_WORKSPACE, getDirectoryForDefaultWorkspace()));
- ref.add(new StringRefAddr(ALLOW_CREATING_WORKSPACES, Boolean.toString(isCreatingWorkspacesAllowed())));
- String[] workspaceNames = getPredefinedWorkspaceNames();
- if (workspaceNames != null && workspaceNames.length != 0) {
- ref.add(new StringRefAddr(PREDEFINED_WORKSPACE_NAMES, StringUtil.combineLines(workspaceNames)));
- }
- return ref;
-
- }
-
- /**
- * {@inheritDoc}
- *
- * @see javax.naming.spi.ObjectFactory#getObjectInstance(java.lang.Object, javax.naming.Name, javax.naming.Context,
- * java.util.Hashtable)
- */
- public Object getObjectInstance( Object obj,
- Name name,
- Context nameCtx,
- Hashtable<?, ?> environment ) throws Exception {
- if (obj instanceof Reference) {
- Map<String, String> values = new HashMap<String, String>();
- Reference ref = (Reference)obj;
- Enumeration<?> en = ref.getAll();
- while (en.hasMoreElements()) {
- RefAddr subref = (RefAddr)en.nextElement();
- if (subref instanceof StringRefAddr) {
- String key = subref.getType();
- Object value = subref.getContent();
- if (value != null) values.put(key, value.toString());
- }
- }
- String sourceName = values.get(SOURCE_NAME);
- String repositoryRootURL = values.get(SVN_REPOSITORY_ROOT_URL);
- String username = values.get(SVN_USERNAME);
- String password = values.get(SVN_PASSWORD);
- String cacheTtlInMillis = values.get(CACHE_TIME_TO_LIVE_IN_MILLISECONDS);
- String retryLimit = values.get(RETRY_LIMIT);
- String defaultWorkspace = values.get(DEFAULT_WORKSPACE);
- String createWorkspaces = values.get(ALLOW_CREATING_WORKSPACES);
-
- String combinedWorkspaceNames = values.get(PREDEFINED_WORKSPACE_NAMES);
- String[] workspaceNames = null;
- if (combinedWorkspaceNames != null) {
- List<String> paths = StringUtil.splitLines(combinedWorkspaceNames);
- workspaceNames = paths.toArray(new String[paths.size()]);
- }
- // Create the source instance ...
- SVNRepositorySource source = new SVNRepositorySource();
- if (sourceName != null) source.setName(sourceName);
- if (cacheTtlInMillis != null) source.setCacheTimeToLiveInMilliseconds(Integer.parseInt(cacheTtlInMillis));
- if (repositoryRootURL != null) source.setRepositoryRootURL(repositoryRootURL);
- if (username != null) source.setUsername(username);
- if (password != null) source.setPassword(password);
- if (retryLimit != null) source.setRetryLimit(Integer.parseInt(retryLimit));
- if (defaultWorkspace != null) source.setDirectoryForDefaultWorkspace(defaultWorkspace);
- if (createWorkspaces != null) source.setCreatingWorkspacesAllowed(Boolean.parseBoolean(createWorkspaces));
- if (workspaceNames != null && workspaceNames.length != 0) source.setPredefinedWorkspaceNames(workspaceNames);
- return source;
- }
- return null;
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.connector.RepositorySource#getConnection()
- */
- public synchronized RepositoryConnection getConnection() throws RepositorySourceException {
-
- String sourceName = getName();
- if (sourceName == null || sourceName.trim().length() == 0) {
- I18n msg = SVNRepositoryConnectorI18n.propertyIsRequired;
- throw new RepositorySourceException(getName(), msg.text("name"));
- }
-
- String sourceUsername = getUsername();
- if (sourceUsername == null || sourceUsername.trim().length() == 0) {
- I18n msg = SVNRepositoryConnectorI18n.propertyIsRequired;
- throw new RepositorySourceException(getUsername(), msg.text("username"));
- }
-
- String sourcePassword = getPassword();
- if (sourcePassword == null) {
- I18n msg = SVNRepositoryConnectorI18n.propertyIsRequired;
- throw new RepositorySourceException(getPassword(), msg.text("password"));
- }
-
- String repositoryRootURL = getRepositoryRootURL();
- if (repositoryRootURL == null || repositoryRootURL.trim().length() == 0) {
- I18n msg = SVNRepositoryConnectorI18n.propertyIsRequired;
- throw new RepositorySourceException(getRepositoryRootURL(), msg.text("repositoryRootURL"));
- }
-
- SVNRepository repos = null;
- // Report the warnings for non-existent predefined workspaces
- boolean reportWarnings = false;
- if (this.availableWorspaceNames == null) {
- // Set up the predefined workspace names ...
- this.availableWorspaceNames = new CopyOnWriteArraySet<String>();
- for (String predefined : this.predefinedWorkspaces) {
- // if they exist, e.g. /trunk, /branches, /tags
- this.availableWorspaceNames.add(predefined);
- }
- // Report the warnings for non-existent predefined workspaces; if no
- // predefined workspaces exist, the repository root URL is used as a
- // pseudo-workspace
- reportWarnings = true;
- for (String url : this.availableWorspaceNames) {
- // check if the predefined workspaces exist.
- if (repos != null) {
- SVNRepositoryUtil.setNewSVNRepositoryLocation(repos, url, true, sourceName);
- } else {
- repos = SVNRepositoryUtil.createRepository(url, sourceUsername, sourcePassword);
- }
- if (!SVNRepositoryUtil.exist(repos)) {
-
- Logger.getLogger(getClass()).warn(SVNRepositoryConnectorI18n.pathForPredefinedWorkspaceDoesNotExist,
- url,
- name);
- }
- if (!SVNRepositoryUtil.isDirectory(repos, "")) {
- Logger.getLogger(getClass()).warn(SVNRepositoryConnectorI18n.pathForPredefinedWorkspaceIsNotDirectory,
- url,
- name);
- }
- }
- }
-
- boolean supportsUpdates = getSupportsUpdates();
-
- SVNRepository defaultWorkspace = null;
- if (repos != null) {
- SVNRepositoryUtil.setNewSVNRepositoryLocation(repos, getRepositoryRootURL(), true, sourceName);
- defaultWorkspace = repos;
- } else {
- defaultWorkspace = SVNRepositoryUtil.createRepository(getRepositoryRootURL(), sourceUsername, sourcePassword);
- }
-
- String defaultURL = getDirectoryForDefaultWorkspace();
- if (defaultURL != null) {
- // Look for the entry at this path .....
- SVNRepository repository = SVNRepositoryUtil.createRepository(defaultURL, sourceUsername, sourcePassword);
- I18n warning = null;
- if (!SVNRepositoryUtil.exist(repository)) {
- warning = SVNRepositoryConnectorI18n.pathForPredefinedWorkspaceDoesNotExist;
- } else if (!SVNRepositoryUtil.isDirectory(repository, "")) {
- warning = SVNRepositoryConnectorI18n.pathForPredefinedWorkspaceIsNotDirectory;
- } else {
- // is a directory and is good to use!
- defaultWorkspace = repository;
- }
- if (reportWarnings && warning != null) {
- Logger.getLogger(getClass()).warn(warning, defaultURL, name);
- }
- }
- this.availableWorspaceNames.add(defaultWorkspace.getLocation().toDecodedString());
- return new SVNRepositoryConnection(name, defaultWorkspace, availableWorspaceNames, isCreatingWorkspacesAllowed(),
- cachePolicy, supportsUpdates, new RepositoryAccessData(getRepositoryRootURL(),
- sourceUsername, sourcePassword));
- }
-
- /**
- * {@inheritDoc}
- *
- * @see org.jboss.dna.graph.connector.RepositorySource#close()
- */
- public synchronized void close() {
- this.availableWorspaceNames = null;
- }
-
- @Immutable
- /* package */class SVNRepositoryCachePolicy implements CachePolicy {
- private static final long serialVersionUID = 1L;
- private final int ttl;
-
- /* package */SVNRepositoryCachePolicy( int ttl ) {
- this.ttl = ttl;
- }
-
- public long getTimeToLive() {
- return ttl;
- }
-
- }
-}
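For context, the getReference/getObjectInstance pair in the deleted SVNRepositorySource follows the standard JNDI round-trip: serialize each configured property into a StringRefAddr, then read them back out of the Reference when re-instantiating. A minimal, JDK-only sketch of that round-trip (class and helper names here are illustrative, not the connector's):

```java
import java.util.Enumeration;
import javax.naming.RefAddr;
import javax.naming.Reference;
import javax.naming.StringRefAddr;

public class ReferenceRoundTrip {

    static final String SOURCE_NAME = "sourceName";
    static final String ROOT_URL = "repositoryRootURL";

    /** Serialize a couple of source properties into a JNDI Reference, skipping nulls. */
    static Reference toReference(String name, String rootUrl) {
        Reference ref = new Reference(ReferenceRoundTrip.class.getName());
        if (name != null) ref.add(new StringRefAddr(SOURCE_NAME, name));
        if (rootUrl != null) ref.add(new StringRefAddr(ROOT_URL, rootUrl));
        return ref;
    }

    /** Read one property back out, as getObjectInstance does when it builds its values map. */
    static String valueOf(Reference ref, String key) {
        Enumeration<RefAddr> en = ref.getAll();
        while (en.hasMoreElements()) {
            RefAddr addr = en.nextElement();
            if (addr instanceof StringRefAddr && key.equals(addr.getType())) {
                Object content = addr.getContent();
                return content == null ? null : content.toString();
            }
        }
        return null;
    }

    public static void main(String[] args) {
        Reference ref = toReference("svn", "http://example.com/svn");
        System.out.println(valueOf(ref, SOURCE_NAME)); // prints svn
    }
}
```

Because every property travels as a string, numeric and boolean settings (retry limit, workspace creation) must be re-parsed on the way back in, which is why getObjectInstance calls Integer.parseInt and Boolean.parseBoolean.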
Deleted: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryUtil.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryUtil.java 2010-01-04 20:58:21 UTC (rev 1523)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/SVNRepositoryUtil.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -1,235 +0,0 @@
-/*
- * JBoss DNA (http://www.jboss.org/dna)
- * See the COPYRIGHT.txt file distributed with this work for information
- * regarding copyright ownership. Some portions may be licensed
- * to Red Hat, Inc. under one or more contributor license agreements.
- * See the AUTHORS.txt file in the distribution for a full listing of
- * individual contributors.
- *
- * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
- * is licensed to you under the terms of the GNU Lesser General Public License as
- * published by the Free Software Foundation; either version 2.1 of
- * the License, or (at your option) any later version.
- *
- * JBoss DNA is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with this software; if not, write to the Free
- * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
- * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
- */
-package org.jboss.dna.connector.svn;
-
-import java.util.Collection;
-import java.util.Collections;
-import org.jboss.dna.graph.connector.RepositorySourceException;
-import org.jboss.dna.graph.request.InvalidWorkspaceException;
-import org.tmatesoft.svn.core.SVNDirEntry;
-import org.tmatesoft.svn.core.SVNErrorCode;
-import org.tmatesoft.svn.core.SVNErrorMessage;
-import org.tmatesoft.svn.core.SVNException;
-import org.tmatesoft.svn.core.SVNNodeKind;
-import org.tmatesoft.svn.core.SVNURL;
-import org.tmatesoft.svn.core.auth.ISVNAuthenticationManager;
-import org.tmatesoft.svn.core.internal.io.dav.DAVRepositoryFactory;
-import org.tmatesoft.svn.core.internal.io.fs.FSRepositoryFactory;
-import org.tmatesoft.svn.core.internal.io.svn.SVNRepositoryFactoryImpl;
-import org.tmatesoft.svn.core.io.SVNRepository;
-import org.tmatesoft.svn.core.io.SVNRepositoryFactory;
-import org.tmatesoft.svn.core.wc.SVNWCUtil;
-
-/**
- */
-public class SVNRepositoryUtil {
-
- /**
- * @param url
- * @param sourceName
- * @return SVNURL
- */
- public static SVNURL createSVNURL( String url,
- String sourceName ) {
-
- SVNURL theUrl;
- try {
- theUrl = SVNURL.parseURIDecoded(url);
- } catch (SVNException e) {
- // protocol not supported by this connector
- throw new RepositorySourceException(sourceName,
- "Protocol is not supported by this connector or there is a problem in the SVN URL");
- }
- return theUrl;
- }
-
- public static void setNewSVNRepositoryLocation( SVNRepository oldRespository,
- String url,
- boolean forceReconnect,
- String sourceName ) {
- try {
- oldRespository.setLocation(createSVNURL(url, sourceName), forceReconnect);
- } catch (SVNException e) {
- throw new RepositorySourceException(sourceName, "the old URL and the new one have different protocols");
- }
- }
-
- /**
- * @param repository
- * @param path
- * @param revisionNumber
- * @param sourceName
- * @return SVNNodeKind
- */
- public static SVNNodeKind checkThePath( SVNRepository repository,
- String path,
- long revisionNumber,
- String sourceName ) {
- SVNNodeKind kind;
- try {
- kind = repository.checkPath(path, revisionNumber);
-
- } catch (SVNException e) {
- return null;
- }
- return kind;
- }
-
- /**
- * Create a {@link SVNRepository} for the given URL (DAV, file, or svn protocols).
- *
- * @param url - the url of the repository.
- * @param username - username credential.
- * @param password - password credential.
- * @return {@link SVNRepository}.
- */
- public static SVNRepository createRepository( String url,
- String username,
- String password ) {
- // for DAV (over http and https)
- DAVRepositoryFactory.setup();
- // For File
- FSRepositoryFactory.setup();
- // for SVN (over svn and svn+ssh)
- SVNRepositoryFactoryImpl.setup();
-
- // The factory knows how to create a DAVRepository
- SVNRepository repository;
- try {
- repository = SVNRepositoryFactory.create(SVNURL.parseURIDecoded(url));
-
- ISVNAuthenticationManager authManager = SVNWCUtil.createDefaultAuthenticationManager(username, password);
- repository.setAuthenticationManager(authManager);
- } catch (SVNException e) {
- throw new InvalidWorkspaceException(SVNRepositoryConnectorI18n.workspaceDoesNotExist.text(e.getMessage()));
- }
- return repository;
- }
-
- /**
- * Utility to get the last segment of a repository path.
- *
- * @param repository
- * @return the last segment.
- */
- public static String getRepositoryWorspaceName( SVNRepository repository ) {
- String[] segments = repository.getLocation().getPath().split("/");
- return segments[segments.length - 1];
- }
-
- private SVNRepositoryUtil() {
- // prevent construction
- }
-
- /**
- * Check if the repository path exists.
- *
- * @param repos
- * @return true if the repository exists, false otherwise.
- */
- public static boolean exist( SVNRepository repos ) {
- try {
- SVNNodeKind kind = repos.checkPath("", -1);
- if (kind == SVNNodeKind.NONE) {
- return false;
- }
- return true;
-
- } catch (SVNException e) {
- return false;
- }
- }
-
- /**
- * Check if repository path is a directory.
- *
- * @param repos
- * @param path
- * @return true if repository path is a directory and false otherwise.
- */
- public static boolean isDirectory( SVNRepository repos,
- String path ) {
- try {
- SVNNodeKind kind = repos.checkPath(path, -1);
- if (kind == SVNNodeKind.DIR) {
- return true;
- }
- } catch (SVNException e) {
- return false;
- }
- return false;
- }
-
- /**
- * @param repos
- * @param path
- * @return a collection of entries from the directory path; never null
- */
- @SuppressWarnings( "unchecked" )
- public static Collection<SVNDirEntry> getDir( SVNRepository repos,
- String path ) {
- try {
- return repos.getDir(path, -1, null, (Collection<SVNDirEntry>)null);
- } catch (SVNException e) {
- return Collections.emptyList();
- }
- }
-
- /**
- * Check if the path is a file.
- *
- * @param repos the repository to check
- * @param path the path within the repository
- * @return true if the path is a file and false otherwise.
- */
- public static boolean isFile( SVNRepository repos,
- String path ) {
- try {
- return repos.checkPath(path, -1) == SVNNodeKind.FILE;
- } catch (SVNException e) {
- return false;
- }
- }
-
- public static boolean exists( SVNRepository repository,
- String path ) throws SVNException {
- try {
- // Check the path once; the original called checkPath twice, making two remote round trips
- SVNNodeKind kind = repository.checkPath(path, -1);
- if (kind == SVNNodeKind.NONE || kind == SVNNodeKind.UNKNOWN) {
- return false;
- }
- } catch (SVNException e) {
- SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
- "unknown error while checking path: {0}",
- e.getMessage());
- throw new SVNException(err);
- }
- return true;
- }
-}
Deleted: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/package-info.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/package-info.java 2010-01-04 20:58:21 UTC (rev 1523)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn/package-info.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -1,29 +0,0 @@
-/*
- * JBoss DNA (http://www.jboss.org/dna)
- * See the COPYRIGHT.txt file distributed with this work for information
- * regarding copyright ownership. Some portions may be licensed
- * to Red Hat, Inc. under one or more contributor license agreements.
- * See the AUTHORS.txt file in the distribution for a full listing of
- * individual contributors.
- *
- * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
- * is licensed to you under the terms of the GNU Lesser General Public License as
- * published by the Free Software Foundation; either version 2.1 of
- * the License, or (at your option) any later version.
- *
- * JBoss DNA is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Lesser General Public License for more details.
- *
- * You should have received a copy of the GNU Lesser General Public
- * License along with this software; if not, write to the Free
- * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
- * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
- */
-/**
- * The classes that make up the connector that accesses content from an SVN repository.
- */
-
-package org.jboss.dna.connector.svn;
-
Added: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnActionExecutor.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnActionExecutor.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnActionExecutor.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -0,0 +1,73 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.connector.svn2;
+
+import org.jboss.dna.connector.scm.ScmAction;
+import org.jboss.dna.connector.scm.ScmActionExecutor;
+import org.tmatesoft.svn.core.SVNErrorCode;
+import org.tmatesoft.svn.core.SVNErrorMessage;
+import org.tmatesoft.svn.core.SVNException;
+import org.tmatesoft.svn.core.io.ISVNEditor;
+import org.tmatesoft.svn.core.io.SVNRepository;
+
+/**
+ * A {@link ScmActionExecutor} that executes {@link ScmAction SCM actions} against an SVN repository through a commit editor.
+ */
+public class SvnActionExecutor implements ScmActionExecutor {
+
+ private final SVNRepository repository;
+
+ /**
+ * @param repository the SVN repository against which actions will be executed; may not be null
+ */
+ public SvnActionExecutor( SVNRepository repository ) {
+ this.repository = repository;
+ }
+
+ /**
+ * @return the SVN repository against which actions are executed
+ */
+ public SVNRepository getRepository() {
+ return repository;
+ }
+
+ /**
+ * Execute the given action and commit the result with the given message.
+ *
+ * @param action the action to execute
+ * @param message the commit message
+ * @throws SVNException if an error occurs while applying or committing the action
+ */
+ public void execute( ScmAction action,
+ String message ) throws SVNException {
+ ISVNEditor editor = this.repository.getCommitEditor(message, null);
+ editor.openRoot(-1);
+ try {
+ action.applyAction(editor);
+ editor.closeDir();
+ editor.closeEdit();
+ } catch (Exception e) {
+ // Abort the in-progress commit before propagating the failure
+ try {
+ editor.abortEdit();
+ } catch (SVNException ignored) {
+ }
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN, "An error occurred: '{0}'", e.getMessage());
+ throw new SVNException(err, e);
+ }
+ }
+}
Property changes on: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnActionExecutor.java
___________________________________________________________________
Name: svn:keywords
+ Id Revision
Name: svn:eol-style
+ LF
Added: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepository.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepository.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepository.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -0,0 +1,889 @@
+package org.jboss.dna.connector.svn2;
+
+import java.io.ByteArrayOutputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.LinkedList;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.UUID;
+import org.jboss.dna.common.i18n.I18n;
+import org.jboss.dna.connector.scm.ScmAction;
+import org.jboss.dna.connector.svn.mgnt.AddDirectory;
+import org.jboss.dna.connector.svn.mgnt.AddFile;
+import org.jboss.dna.connector.svn.mgnt.DeleteEntry;
+import org.jboss.dna.connector.svn.mgnt.UpdateFile;
+import org.jboss.dna.graph.DnaIntLexicon;
+import org.jboss.dna.graph.DnaLexicon;
+import org.jboss.dna.graph.ExecutionContext;
+import org.jboss.dna.graph.JcrLexicon;
+import org.jboss.dna.graph.JcrNtLexicon;
+import org.jboss.dna.graph.NodeConflictBehavior;
+import org.jboss.dna.graph.connector.RepositorySourceException;
+import org.jboss.dna.graph.connector.path.AbstractWritablePathWorkspace;
+import org.jboss.dna.graph.connector.path.DefaultPathNode;
+import org.jboss.dna.graph.connector.path.PathNode;
+import org.jboss.dna.graph.connector.path.WritablePathRepository;
+import org.jboss.dna.graph.connector.path.WritablePathWorkspace;
+import org.jboss.dna.graph.connector.path.cache.WorkspaceCache;
+import org.jboss.dna.graph.property.Binary;
+import org.jboss.dna.graph.property.BinaryFactory;
+import org.jboss.dna.graph.property.DateTimeFactory;
+import org.jboss.dna.graph.property.Name;
+import org.jboss.dna.graph.property.NameFactory;
+import org.jboss.dna.graph.property.NamespaceRegistry;
+import org.jboss.dna.graph.property.Path;
+import org.jboss.dna.graph.property.PathFactory;
+import org.jboss.dna.graph.property.Property;
+import org.jboss.dna.graph.property.PropertyFactory;
+import org.jboss.dna.graph.property.Path.Segment;
+import org.jboss.dna.graph.request.InvalidRequestException;
+import org.tmatesoft.svn.core.SVNDirEntry;
+import org.tmatesoft.svn.core.SVNErrorCode;
+import org.tmatesoft.svn.core.SVNErrorMessage;
+import org.tmatesoft.svn.core.SVNException;
+import org.tmatesoft.svn.core.SVNNodeKind;
+import org.tmatesoft.svn.core.SVNProperties;
+import org.tmatesoft.svn.core.SVNProperty;
+import org.tmatesoft.svn.core.SVNURL;
+import org.tmatesoft.svn.core.auth.ISVNAuthenticationManager;
+import org.tmatesoft.svn.core.internal.io.dav.DAVRepositoryFactory;
+import org.tmatesoft.svn.core.internal.io.fs.FSRepositoryFactory;
+import org.tmatesoft.svn.core.internal.io.svn.SVNRepositoryFactoryImpl;
+import org.tmatesoft.svn.core.io.SVNRepository;
+import org.tmatesoft.svn.core.io.SVNRepositoryFactory;
+import org.tmatesoft.svn.core.wc.SVNWCUtil;
+
+public class SvnRepository extends WritablePathRepository {
+
+ private static final String DEFAULT_MIME_TYPE = "application/octet-stream";
+ private static final byte[] EMPTY_BYTE_ARRAY = new byte[0];
+
+ private final SvnRepositorySource source;
+
+ static {
+ // for DAV (over http and https)
+ DAVRepositoryFactory.setup();
+ // For File
+ FSRepositoryFactory.setup();
+ // for SVN (over svn and svn+ssh)
+ SVNRepositoryFactoryImpl.setup();
+ }
+
+ public SvnRepository( SvnRepositorySource source ) {
+ super(source);
+
+ this.source = source;
+ initialize();
+ }
+
+ @Override
+ protected void initialize() {
+ ExecutionContext context = source.getRepositoryContext().getExecutionContext();
+ for (String workspaceName : source.getPredefinedWorkspaceNames()) {
+ doCreateWorkspace(context, workspaceName);
+ }
+
+ String defaultWorkspaceName = source.getDirectoryForDefaultWorkspace();
+ if (defaultWorkspaceName != null && !workspaces.containsKey(defaultWorkspaceName)) {
+ doCreateWorkspace(context, defaultWorkspaceName);
+ }
+
+ }
+
+ public WorkspaceCache getCache( String workspaceName ) {
+ return source.getPathRepositoryCache().getCache(workspaceName);
+ }
+
+ /**
+ * Internal method that creates a workspace and adds it to the map of active workspaces without checking to see if the source
+ * allows creating workspaces. This is useful when setting up predefined workspaces.
+ *
+ * @param context the current execution context; may not be null
+ * @param name the name of the workspace to create; may not be null
+ * @return the newly created workspace; never null
+ */
+ private WritablePathWorkspace doCreateWorkspace( ExecutionContext context,
+ String name ) {
+ SvnWorkspace workspace = new SvnWorkspace(name, source.getRootNodeUuid());
+
+ workspaces.putIfAbsent(name, workspace);
+ return (WritablePathWorkspace)workspaces.get(name);
+
+ }
+
+ @Override
+ protected WritablePathWorkspace createWorkspace( ExecutionContext context,
+ String name ) {
+ if (!source.isCreatingWorkspacesAllowed()) {
+ String msg = SvnRepositoryConnectorI18n.unableToCreateWorkspaces.text(getSourceName(), name);
+ throw new InvalidRequestException(msg);
+ }
+
+ return doCreateWorkspace(context, name);
+ }
+
+ class SvnWorkspace extends AbstractWritablePathWorkspace {
+
+ /**
+ * Only certain properties are tolerated when writing content (dna:resource or nt:resource) nodes. These properties are
+ * implicitly stored (primary type, data) or silently ignored (encoded, mimetype, last modified). The silently ignored
+ * properties must be accepted to stay compatible with the JCR specification.
+ */
+ private final Set<Name> ALLOWABLE_PROPERTIES_FOR_CONTENT = Collections.unmodifiableSet(new HashSet<Name>(
+ Arrays.asList(new Name[] {
+ JcrLexicon.PRIMARY_TYPE,
+ JcrLexicon.DATA,
+ JcrLexicon.ENCODED,
+ JcrLexicon.MIMETYPE,
+ JcrLexicon.LAST_MODIFIED,
+ JcrLexicon.UUID,
+ DnaIntLexicon.NODE_DEFINITON})));
+ /**
+ * Only certain properties are tolerated when writing files (nt:file) or folders (nt:folder) nodes. These properties are
+ * implicitly stored in the file or folder (primary type, created).
+ */
+ private final Set<Name> ALLOWABLE_PROPERTIES_FOR_FILE_OR_FOLDER = Collections.unmodifiableSet(new HashSet<Name>(
+ Arrays.asList(new Name[] {
+ JcrLexicon.PRIMARY_TYPE,
+ JcrLexicon.CREATED,
+ JcrLexicon.UUID,
+ DnaIntLexicon.NODE_DEFINITON})));
+
+ private final SVNRepository workspaceRoot;
+
+ public SvnWorkspace( String name,
+ UUID rootNodeUuid ) {
+ super(name, rootNodeUuid);
+
+ try {
+ workspaceRoot = SVNRepositoryFactory.create(SVNURL.parseURIDecoded(name));
+
+ ISVNAuthenticationManager authManager = SVNWCUtil.createDefaultAuthenticationManager(source.getUsername(),
+ source.getPassword());
+ workspaceRoot.setAuthenticationManager(authManager);
+ } catch (SVNException ex) {
+ throw new IllegalStateException(ex);
+ }
+ }
+
+ public Path getLowestExistingPath( Path path ) {
+ // Check for a null parent before reading the node to avoid an NPE above the root
+ while ((path = path.getParent()) != null) {
+ if (getNode(path) != null) {
+ return path;
+ }
+ }
+
+ assert false : "workspace root path was not a valid path";
+ return null;
+ }
+
+ public PathNode getNode( Path path ) {
+ WorkspaceCache cache = getCache(getName());
+
+ PathNode node = cache.get(path);
+ if (node != null) return node;
+
+ ExecutionContext context = source.getRepositoryContext().getExecutionContext();
+ List<Property> properties = new LinkedList<Property>();
+ List<Segment> children = new LinkedList<Segment>();
+
+ try {
+ boolean result = readNode(context, this.getName(), path, properties, children);
+ if (!result) return null;
+ } catch (SVNException ex) {
+ return null;
+ }
+
+ UUID uuid = path.isRoot() ? source.getRootNodeUuid() : null;
+ node = new DefaultPathNode(path, uuid, properties, children);
+
+ cache.set(node);
+ return node;
+ }
+
+ public PathNode createNode( ExecutionContext context,
+ PathNode parentNode,
+ Name name,
+ Map<Name, Property> properties,
+ NodeConflictBehavior conflictBehavior ) {
+
+ NamespaceRegistry registry = context.getNamespaceRegistry();
+ NameFactory nameFactory = context.getValueFactories().getNameFactory();
+ PathFactory pathFactory = context.getValueFactories().getPathFactory();
+
+ // New name to commit into the svn repos workspace
+ String newName = name.getString(registry);
+
+ Property primaryTypeProp = properties.get(JcrLexicon.PRIMARY_TYPE);
+ Name primaryType = primaryTypeProp == null ? null : nameFactory.create(primaryTypeProp.getFirstValue());
+
+ Path parentPath = parentNode.getPath();
+ String parentPathAsString = parentPath.getString(registry);
+ Path newPath = pathFactory.create(parentPath, name);
+
+ String newChildPath = null;
+
+ // File
+ if (JcrNtLexicon.FILE.equals(primaryType)) {
+ ensureValidProperties(context, properties.values(), ALLOWABLE_PROPERTIES_FOR_FILE_OR_FOLDER);
+ // The parent node already exists
+ boolean skipWrite = false;
+
+ if (parentPath.isRoot()) {
+ if (!source.getRepositoryRootUrl().equals(getName())) {
+ newChildPath = newName;
+ } else {
+ newChildPath = "/" + newName;
+ }
+ } else {
+ newChildPath = newPath.getString(registry);
+ if (!source.getRepositoryRootUrl().equals(getName())) {
+ newChildPath = newChildPath.substring(1);
+ }
+ }
+
+ // Check if the new name already exists
+ try {
+ if (SvnRepositoryUtil.exists(workspaceRoot, newChildPath)) {
+ if (conflictBehavior.equals(NodeConflictBehavior.APPEND)) {
+ I18n msg = SvnRepositoryConnectorI18n.sameNameSiblingsAreNotAllowed;
+ throw new InvalidRequestException(msg.text("The SVN connector does not support same-name siblings"));
+ } else if (conflictBehavior.equals(NodeConflictBehavior.DO_NOT_REPLACE)) {
+ skipWrite = true;
+ }
+ }
+ } catch (SVNException e1) {
+ throw new RepositorySourceException(getSourceName(), e1.getMessage());
+ }
+
+ // Don't try to write if the node conflict behavior is DO_NOT_REPLACE
+ if (!skipWrite) {
+ // create a new, empty file
+ if (newChildPath != null) {
+ try {
+ String rootPath = null;
+ if (parentPath.isRoot()) {
+ rootPath = "";
+ } else {
+ rootPath = parentPathAsString;
+ }
+ newFile(rootPath, newName, EMPTY_BYTE_ARRAY, null, getName(), workspaceRoot);
+ } catch (SVNException e) {
+ I18n msg = SvnRepositoryConnectorI18n.couldNotCreateFile;
+ throw new RepositorySourceException(getSourceName(), msg.text(parentPathAsString,
+ getName(),
+ getSourceName(),
+ e.getMessage()), e);
+ }
+ }
+ }
+ } else if (JcrNtLexicon.RESOURCE.equals(primaryType) || DnaLexicon.RESOURCE.equals(primaryType)) { // Resource
+ ensureValidProperties(context, properties.values(), ALLOWABLE_PROPERTIES_FOR_CONTENT);
+ if (parentPath.isRoot()) {
+ newChildPath = parentPathAsString;
+ if (!source.getRepositoryRootUrl().equals(getName())) {
+ newChildPath = parentPathAsString.substring(1);
+ }
+ } else {
+ newChildPath = parentPathAsString;
+ if (!source.getRepositoryRootUrl().equals(getName())) {
+ newChildPath = newChildPath.substring(1);
+ }
+ }
+
+ if (!JcrLexicon.CONTENT.equals(name)) {
+ I18n msg = SvnRepositoryConnectorI18n.invalidNameForResource;
+ throw new RepositorySourceException(getSourceName(), msg.text(parentPathAsString,
+ getName(),
+ getSourceName(),
+ newName));
+ }
+
+ Property parentPrimaryType = parentNode.getProperty(JcrLexicon.PRIMARY_TYPE);
+ Name parentPrimaryTypeName = parentPrimaryType == null ? null : nameFactory.create(parentPrimaryType.getFirstValue());
+ if (!JcrNtLexicon.FILE.equals(parentPrimaryTypeName)) {
+ I18n msg = SvnRepositoryConnectorI18n.invalidPathForResource;
+ throw new RepositorySourceException(getSourceName(), msg.text(parentPathAsString, getName(), getSourceName()));
+ }
+
+ boolean skipWrite = false;
+ if (conflictBehavior.equals(NodeConflictBehavior.APPEND)) {
+ I18n msg = SvnRepositoryConnectorI18n.sameNameSiblingsAreNotAllowed;
+ throw new InvalidRequestException(msg.text("The SVN connector does not support same-name siblings"));
+ } else if (conflictBehavior.equals(NodeConflictBehavior.DO_NOT_REPLACE)) {
+ // TODO check if the file already has content
+ skipWrite = true;
+ }
+
+ if (!skipWrite) {
+ Property dataProperty = properties.get(JcrLexicon.DATA);
+ if (dataProperty == null) {
+ I18n msg = SvnRepositoryConnectorI18n.missingRequiredProperty;
+ String dataPropName = JcrLexicon.DATA.getString(registry);
+ throw new RepositorySourceException(getSourceName(), msg.text(parentPathAsString,
+ getName(),
+ getSourceName(),
+ dataPropName));
+ }
+
+ BinaryFactory binaryFactory = context.getValueFactories().getBinaryFactory();
+ Binary binary = binaryFactory.create(properties.get(JcrLexicon.DATA).getFirstValue());
+ // get old data
+ ByteArrayOutputStream contents = new ByteArrayOutputStream();
+ SVNProperties svnProperties = new SVNProperties();
+ try {
+ workspaceRoot.getFile(newChildPath, -1, svnProperties, contents);
+ byte[] oldData = contents.toByteArray();
+
+ // modify the empty old data with the new resource
+ if (oldData != null) {
+ String pathToFile;
+ if (parentPath.isRoot()) {
+ pathToFile = "";
+ } else {
+ pathToFile = parentPath.getParent().getString(registry);
+ }
+ String fileName = parentPath.getLastSegment().getString(registry);
+
+ modifyFile(pathToFile, fileName, oldData, binary.getBytes(), null, getName(), workspaceRoot);
+ }
+ } catch (SVNException e) {
+ I18n msg = SvnRepositoryConnectorI18n.couldNotReadData;
+ throw new RepositorySourceException(getSourceName(), msg.text(parentPathAsString,
+ getName(),
+ getSourceName(),
+ e.getMessage()), e);
+ }
+ }
+
+ } else if (JcrNtLexicon.FOLDER.equals(primaryType) || primaryType == null) { // Folder
+ ensureValidProperties(context, properties.values(), ALLOWABLE_PROPERTIES_FOR_FILE_OR_FOLDER);
+ try {
+ mkdir(parentPathAsString, newName, null, getName(), workspaceRoot);
+ } catch (SVNException e) {
+ I18n msg = SvnRepositoryConnectorI18n.couldNotCreateFile;
+ throw new RepositorySourceException(getSourceName(), msg.text(parentPathAsString,
+ getName(),
+ getSourceName(),
+ e.getMessage()), e);
+ }
+ } else {
+ I18n msg = SvnRepositoryConnectorI18n.unsupportedPrimaryType;
+ throw new RepositorySourceException(getSourceName(), msg.text(primaryType.getString(registry),
+ parentPathAsString,
+ getName(),
+ getSourceName()));
+ }
+
+ PathNode node = getNode(newPath);
+
+ List<Segment> newChildren = new ArrayList<Segment>(parentNode.getChildSegments().size() + 1);
+ newChildren.addAll(parentNode.getChildSegments());
+ newChildren.add(node.getPath().getLastSegment());
+
+ WorkspaceCache cache = getCache(getName());
+ cache.set(new DefaultPathNode(parentNode.getPath(), parentNode.getUuid(), parentNode.getProperties(), newChildren));
+ cache.set(node);
+
+ return node;
+ }
+
+ /**
+ * Create a directory.
+ *
+ * @param rootDirPath - the root directory where the created directory will reside
+ * @param childDirPath - the name of the directory to create
+ * @param comment - the commit comment for the creation
+ * @param inWorkspace - the name of the workspace in which the directory is created
+ * @param currentRepository - the repository against which the directory is created
+ * @throws SVNException - if an error occurs during the creation.
+ */
+ private void mkdir( String rootDirPath,
+ String childDirPath,
+ String comment,
+ String inWorkspace,
+ SVNRepository currentRepository ) throws SVNException {
+
+ String tempParentPath = rootDirPath;
+ if (!source.getRepositoryRootUrl().equals(inWorkspace)) {
+ if (!tempParentPath.equals("/") && tempParentPath.startsWith("/")) {
+ tempParentPath = tempParentPath.substring(1);
+ } else if (tempParentPath.equals("/")) {
+ tempParentPath = "";
+ }
+ }
+ String checkPath = tempParentPath.length() == 0 ? childDirPath : tempParentPath + "/" + childDirPath;
+ SVNNodeKind nodeKind = null;
+ try {
+ nodeKind = currentRepository.checkPath(checkPath, -1);
+ } catch (SVNException e) {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
+ "There may be a connection problem with the repository or a user authentication failure: {0}",
+ e.getMessage());
+ throw new SVNException(err);
+ }
+
+ if (nodeKind != null && nodeKind == SVNNodeKind.NONE) {
+ ScmAction addNodeAction = new AddDirectory(rootDirPath, childDirPath);
+ SvnActionExecutor executor = new SvnActionExecutor(currentRepository);
+ comment = comment == null ? "Create a new directory " + childDirPath : comment;
+ executor.execute(addNodeAction, comment);
+ } else {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
+ "Node with name '{0}' can't be created (already exists)",
+ childDirPath);
+ throw new SVNException(err);
+ }
+ }
+
+ /**
+ * Create a file.
+ *
+ * @param rootDirPath the directory in which the file will be created
+ * @param childFilePath the name of the file to create
+ * @param content the initial content of the file
+ * @param comment the commit comment for the creation
+ * @param inWorkspace the name of the workspace in which the file is created
+ * @param currentRepository the repository against which the file is created
+ * @throws SVNException if an error occurs during the creation
+ */
+ private void newFile( String rootDirPath,
+ String childFilePath,
+ byte[] content,
+ String comment,
+ String inWorkspace,
+ SVNRepository currentRepository ) throws SVNException {
+
+ String tempParentPath = rootDirPath;
+ if (!source.getRepositoryRootUrl().equals(inWorkspace)) {
+ if (!tempParentPath.equals("/") && tempParentPath.startsWith("/")) {
+ tempParentPath = tempParentPath.substring(1);
+ }
+ }
+ String checkPath = tempParentPath + "/" + childFilePath;
+ SVNNodeKind nodeKind = null;
+ try {
+ nodeKind = currentRepository.checkPath(checkPath, -1);
+ } catch (SVNException e) {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
+ "There may be a connection problem with the repository or a user authentication failure: {0}",
+ e.getMessage());
+ throw new SVNException(err);
+ }
+
+ if (nodeKind != null && nodeKind == SVNNodeKind.NONE) {
+ ScmAction addFileNodeAction = new AddFile(rootDirPath, childFilePath, content);
+ SvnActionExecutor executor = new SvnActionExecutor(currentRepository);
+ comment = comment == null ? "Create a new file " + childFilePath : comment;
+ executor.execute(addFileNodeAction, comment);
+ } else {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
+ "Item with name '{0}' can't be created (already exists)",
+ childFilePath);
+ throw new SVNException(err);
+ }
+ }
+
+ /**
+ * Modify a file.
+ *
+ * @param rootPath the directory that contains the file
+ * @param fileName the name of the file to modify
+ * @param oldData the existing content of the file
+ * @param newData the new content of the file
+ * @param comment the commit comment for the modification
+ * @param inWorkspace the name of the workspace in which the file is modified
+ * @param currentRepository the repository against which the file is modified
+ * @throws SVNException if an error occurs during the modification
+ */
+ private void modifyFile( String rootPath,
+ String fileName,
+ byte[] oldData,
+ byte[] newData,
+ String comment,
+ String inWorkspace,
+ SVNRepository currentRepository ) throws SVNException {
+ assert rootPath != null;
+ assert fileName != null;
+ assert oldData != null;
+ assert inWorkspace != null;
+ assert currentRepository != null;
+
+ try {
+
+ if (!source.getRepositoryRootUrl().equals(inWorkspace)) {
+ if (rootPath.equals("/")) {
+ rootPath = "";
+ } else {
+ rootPath = rootPath.substring(1) + "/";
+ }
+ } else {
+ if (!rootPath.equals("/")) {
+ rootPath = rootPath + "/";
+ }
+ }
+ String path = rootPath + fileName;
+
+ SVNNodeKind nodeKind = currentRepository.checkPath(path, -1);
+ if (nodeKind == SVNNodeKind.NONE || nodeKind == SVNNodeKind.UNKNOWN) {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.ENTRY_NOT_FOUND,
+ "Item with name '{0}' can't be found",
+ path);
+ throw new SVNException(err);
+ }
+
+ ScmAction modifyFileAction = new UpdateFile(rootPath, fileName, oldData, newData);
+ SvnActionExecutor executor = new SvnActionExecutor(currentRepository);
+ comment = comment == null ? "Modify the file " + fileName : comment;
+ executor.execute(modifyFileAction, comment);
+
+ } catch (SVNException e) {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN, "An error occurred: " + e.getMessage());
+ throw new SVNException(err, e);
+ }
+
+ }
+
+ /**
+ * Delete an entry from the repository.
+ *
+ * @param path the path of the entry to delete
+ * @param comment the commit comment for the deletion
+ * @param inWorkspace the name of the workspace from which the entry is deleted
+ * @param currentRepository the repository against which the entry is deleted
+ * @throws SVNException if an error occurs during the deletion
+ */
+ private void eraseEntry( String path,
+ String comment,
+ String inWorkspace,
+ SVNRepository currentRepository ) throws SVNException {
+ assert path != null;
+ assert inWorkspace != null;
+ if (path.equals("/") || path.equals("")) {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.BAD_URL, "The root directory cannot be deleted");
+ throw new SVNException(err);
+ }
+
+ try {
+ ScmAction deleteEntryAction = new DeleteEntry(path);
+ SvnActionExecutor executor = new SvnActionExecutor(currentRepository);
+ comment = comment == null ? "Delete the " + path : comment;
+ executor.execute(deleteEntryAction, comment);
+ } catch (SVNException e) {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
+ "unknown error during delete action: {0}",
+ e.getMessage());
+ throw new SVNException(err);
+ }
+ }
+
+ public boolean removeNode( ExecutionContext context,
+ Path nodePath ) {
+
+ NamespaceRegistry registry = context.getNamespaceRegistry();
+
+ boolean isContentNode = !nodePath.isRoot() && JcrLexicon.CONTENT.equals(nodePath.getLastSegment().getName());
+ Path actualPath = isContentNode ? nodePath.getParent() : nodePath;
+
+ try {
+ SVNNodeKind kind = getNodeKind(context, actualPath, source.getRepositoryRootUrl());
+
+ if (kind == SVNNodeKind.NONE) {
+ return false;
+ }
+
+ if (isContentNode) {
+ String rootPath = actualPath.getParent().getString(registry);
+ String fileName = actualPath.getLastSegment().getString(registry);
+ modifyFile(rootPath, fileName, EMPTY_BYTE_ARRAY, EMPTY_BYTE_ARRAY, null, getName(), workspaceRoot);
+ } else {
+ eraseEntry(actualPath.getString(registry), null, getName(), workspaceRoot);
+ }
+ } catch (SVNException e) {
+ throw new RepositorySourceException(getSourceName(),
+ SvnRepositoryConnectorI18n.deleteFailed.text(nodePath, getSourceName()));
+ }
+
+ getCache(getName()).invalidate(nodePath);
+
+ return true;
+ }
+
+ public PathNode setProperties( ExecutionContext context,
+ Path nodePath,
+ Map<Name, Property> properties ) {
+ PathNode targetNode = getNode(nodePath);
+ if (targetNode == null) return null;
+
+ /*
+ * You can't really remove any properties from SVN nodes.
+ * You can clear the data of a dna:resource though
+ */
+
+ NameFactory nameFactory = context.getValueFactories().getNameFactory();
+ Property primaryTypeProperty = targetNode.getProperty(JcrLexicon.PRIMARY_TYPE);
+ Name primaryTypeName = primaryTypeProperty == null ? null : nameFactory.create(primaryTypeProperty.getFirstValue());
+ if (DnaLexicon.RESOURCE.equals(primaryTypeName)) {
+
+ for (Map.Entry<Name, Property> entry : properties.entrySet()) {
+ if (JcrLexicon.DATA.equals(entry.getKey())) {
+ NamespaceRegistry registry = context.getNamespaceRegistry();
+ byte[] data;
+ if (entry.getValue() == null) {
+ data = EMPTY_BYTE_ARRAY;
+ } else {
+ BinaryFactory binaryFactory = context.getValueFactories().getBinaryFactory();
+ data = binaryFactory.create(entry.getValue().getFirstValue()).getBytes();
+
+ }
+
+ try {
+ Path actualPath = nodePath.getParent();
+ modifyFile(actualPath.getParent().getString(registry),
+ actualPath.getLastSegment().getString(registry),
+ EMPTY_BYTE_ARRAY,
+ data,
+ "",
+ getName(),
+ workspaceRoot);
+
+ PathNode node = getNode(nodePath);
+ getCache(getName()).set(node);
+
+ return node;
+ } catch (SVNException ex) {
+ throw new RepositorySourceException(getSourceName(),
+ SvnRepositoryConnectorI18n.deleteFailed.text(nodePath,
+ getSourceName()), ex);
+ }
+ }
+ }
+ }
+
+ return targetNode;
+ }
+
+ protected boolean readNode( ExecutionContext context,
+ String workspaceName,
+ Path requestedPath,
+ List<Property> properties,
+ List<Segment> children ) throws SVNException {
+ PathFactory pathFactory = context.getValueFactories().getPathFactory();
+ NamespaceRegistry registry = context.getNamespaceRegistry();
+
+ if (requestedPath.isRoot()) {
+ // workspace root must be a directory
+ if (children != null) {
+ final Collection<SVNDirEntry> entries = SvnRepositoryUtil.getDir(workspaceRoot, "");
+ for (SVNDirEntry entry : entries) {
+ // All of the children of a directory will be another directory or a file, but never a "jcr:content" node
+ // ...
+ children.add(pathFactory.createSegment(entry.getName()));
+ }
+ }
+ // There are no properties on the root ...
+ } else {
+ // Generate the properties for this File object ...
+ PropertyFactory factory = context.getPropertyFactory();
+ DateTimeFactory dateFactory = context.getValueFactories().getDateFactory();
+
+ // Figure out the kind of node this represents ...
+ SVNNodeKind kind = getNodeKind(context, requestedPath, source.getRepositoryRootUrl());
+ if (kind == SVNNodeKind.NONE) {
+ // The node doesn't exist
+ return false;
+ }
+ if (kind == SVNNodeKind.DIR) {
+ String directoryPath = requestedPath.getString(registry);
+ if (!source.getRepositoryRootUrl().equals(workspaceName)) {
+ directoryPath = directoryPath.substring(1);
+ }
+ if (children != null) {
+ // Decide how to represent the children ...
+ Collection<SVNDirEntry> dirEntries = SvnRepositoryUtil.getDir(workspaceRoot, directoryPath);
+ for (SVNDirEntry entry : dirEntries) {
+ // All of the children of a directory will be another directory or a file,
+ // but never a "jcr:content" node ...
+ children.add(pathFactory.createSegment(entry.getName()));
+ }
+ }
+ if (properties != null) {
+ // Load the properties for this directory ...
+ properties.add(factory.create(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FOLDER));
+ SVNDirEntry entry = getEntryInfo(workspaceRoot, directoryPath);
+ if (entry != null) {
+ properties.add(factory.create(JcrLexicon.CREATED, dateFactory.create(entry.getDate())));
+ }
+ }
+ } else {
+ // It's not a directory, so must be a file; the only child of an nt:file is the "jcr:content" node
+ // ...
+ if (requestedPath.endsWith(JcrLexicon.CONTENT)) {
+ // There are never any children of these nodes, just properties ...
+ if (properties != null) {
+ String contentPath = requestedPath.getParent().getString(registry);
+ if (!source.getRepositoryRootUrl().equals(workspaceName)) {
+ contentPath = contentPath.substring(1);
+ }
+ SVNDirEntry entry = getEntryInfo(workspaceRoot, contentPath);
+ if (entry != null) {
+ // The request is to get properties of the "jcr:content" child node ...
+ // Do NOT use "nt:resource", since it extends "mix:referenceable". The JCR spec
+ // does not require that "jcr:content" is of type "nt:resource", but rather just
+ // suggests it. Therefore, we can use "dna:resource", which is identical to
+ // "nt:resource" except it does not extend "mix:referenceable"
+ properties.add(factory.create(JcrLexicon.PRIMARY_TYPE, DnaLexicon.RESOURCE));
+ properties.add(factory.create(JcrLexicon.LAST_MODIFIED, dateFactory.create(entry.getDate())));
+ }
+
+ ByteArrayOutputStream os = new ByteArrayOutputStream();
+ SVNProperties fileProperties = new SVNProperties();
+ getData(contentPath, fileProperties, os);
+ String mimeType = fileProperties.getStringValue(SVNProperty.MIME_TYPE);
+ if (mimeType == null) mimeType = DEFAULT_MIME_TYPE;
+ properties.add(factory.create(JcrLexicon.MIMETYPE, mimeType));
+
+ if (os.toByteArray().length > 0) {
+ // Now put the file's content into the "jcr:data" property ...
+ BinaryFactory binaryFactory = context.getValueFactories().getBinaryFactory();
+ properties.add(factory.create(JcrLexicon.DATA, binaryFactory.create(os.toByteArray())));
+ }
+ }
+ } else {
+ // Determine the corresponding file path for this object ...
+ String filePath = requestedPath.getString(registry);
+ if (!source.getRepositoryRootUrl().equals(workspaceName)) {
+ filePath = filePath.substring(1);
+ }
+ if (children != null) {
+ // Not a "jcr:content" child node but rather an nt:file node, so add the child ...
+ children.add(pathFactory.createSegment(JcrLexicon.CONTENT));
+ }
+ if (properties != null) {
+ // Now add the properties to "nt:file" ...
+ properties.add(factory.create(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FILE));
+ ByteArrayOutputStream os = new ByteArrayOutputStream();
+ SVNProperties fileProperties = new SVNProperties();
+ getData(filePath, fileProperties, os);
+ String created = fileProperties.getStringValue(SVNProperty.COMMITTED_DATE);
+ properties.add(factory.create(JcrLexicon.CREATED, dateFactory.create(created)));
+ }
+ }
+ }
+ }
+ return true;
+ }
+
+ /**
+ * Get the {@link SVNDirEntry} information for a path.
+ *
+ * @param repos - the repository to query.
+ * @param path - the path.
+ * @return the {@link SVNDirEntry}, or null if there is no such entry
+ */
+ protected SVNDirEntry getEntryInfo( SVNRepository repos,
+ String path ) {
+ assert path != null;
+ SVNDirEntry entry = null;
+ try {
+ entry = repos.info(path, -1);
+ } catch (SVNException e) {
+ throw new RepositorySourceException(
+ getSourceName(),
+ SvnRepositoryConnectorI18n.connectingFailureOrUserAuthenticationProblem.text(getSourceName()));
+ }
+ return entry;
+ }
+
+ /**
+ * Get the content of a file.
+ *
+ * @param path - the path to that file.
+ * @param properties - the properties of the file.
+ * @param os - the output stream where to store the content.
+ * @throws SVNException - if the path does not exist at that revision or in case of a connection problem.
+ */
+ protected void getData( String path,
+ SVNProperties properties,
+ OutputStream os ) throws SVNException {
+ workspaceRoot.getFile(path, -1, properties, os);
+ }
+
+ protected SVNNodeKind getNodeKind( ExecutionContext context,
+ Path path,
+ String repositoryRootUrl ) throws SVNException {
+ assert path != null;
+ assert repositoryRootUrl != null;
+
+ // See if the path is a "jcr:content" node ...
+ if (path.endsWith(JcrLexicon.CONTENT)) {
+ // We only want to use the parent path to find the actual file ...
+ path = path.getParent();
+ }
+ String pathAsString = path.getString(context.getNamespaceRegistry());
+ if (!repositoryRootUrl.equals(getName())) {
+ pathAsString = pathAsString.substring(1);
+ }
+
+ String absolutePath = pathAsString;
+ SVNNodeKind kind = workspaceRoot.checkPath(absolutePath, -1);
+ if (kind == SVNNodeKind.UNKNOWN) {
+ // node is unknown
+ throw new RepositorySourceException(getSourceName(),
+ SvnRepositoryConnectorI18n.nodeIsActuallyUnknow.text(pathAsString));
+ }
+ return kind;
+ }
+
+ protected SVNRepository getWorkspaceDirectory( String workspaceName ) {
+ if (workspaceName == null) workspaceName = source.getDirectoryForDefaultWorkspace();
+ SVNRepository repos = SvnRepositoryUtil.createRepository(workspaceName, source.getUsername(), source.getPassword());
+ return SvnRepositoryUtil.isDirectory(repos, "") ? repos : null;
+ }
+
+ /**
+ * Checks that the collection of {@code properties} only contains properties with allowable names.
+ *
+ * @param context the execution context; may not be null
+ * @param properties the properties to check
+ * @param validPropertyNames the names of the allowable properties
+ * @throws RepositorySourceException if {@code properties} contains a property whose name is not in {@code validPropertyNames}
+ * @see #ALLOWABLE_PROPERTIES_FOR_CONTENT
+ * @see #ALLOWABLE_PROPERTIES_FOR_FILE_OR_FOLDER
+ */
+ protected void ensureValidProperties( ExecutionContext context,
+ Collection<Property> properties,
+ Set<Name> validPropertyNames ) {
+ List<String> invalidNames = new LinkedList<String>();
+ NamespaceRegistry registry = context.getNamespaceRegistry();
+
+ for (Property property : properties) {
+ if (!validPropertyNames.contains(property.getName())) {
+ invalidNames.add(property.getName().getString(registry));
+ }
+ }
+
+ if (!invalidNames.isEmpty()) {
+ throw new RepositorySourceException(getSourceName(),
+ SvnRepositoryConnectorI18n.invalidPropertyNames.text(invalidNames.toString()));
+ }
+ }
+
+ }
+
+}
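A note on the path handling above: the `directoryPath`, `contentPath`, and `filePath` branches all repeat one idiom, stripping the JCR path's leading slash whenever the workspace is not the repository root, so the path resolves relative to the workspace directory. A minimal, self-contained sketch of that normalization (the class and method names here are hypothetical, not part of the connector):

```java
public class SvnPathNormalizer {

    // Mirrors the connector's repeated idiom: when the workspace name is not the
    // repository root URL, the JCR path's leading '/' is dropped so the remainder
    // becomes a path relative to the workspace directory. Names are hypothetical.
    public static String toSvnPath( String jcrPath,
                                    String workspaceName,
                                    String repositoryRootUrl ) {
        if (!repositoryRootUrl.equals(workspaceName) && jcrPath.startsWith("/")) {
            return jcrPath.substring(1);
        }
        return jcrPath;
    }

    public static void main( String[] args ) {
        // Workspace below the root: leading slash is stripped
        System.out.println(toSvnPath("/trunk/readme.txt", "http://svn.example.com/repo/trunk", "http://svn.example.com/repo"));
        // Workspace IS the root: path is kept as-is
        System.out.println(toSvnPath("/readme.txt", "http://svn.example.com/repo", "http://svn.example.com/repo"));
    }
}
```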
Property changes on: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepository.java
___________________________________________________________________
Name: svn:keywords
+ Id Revision
Name: svn:eol-style
+ LF
Added: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorI18n.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorI18n.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorI18n.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -0,0 +1,89 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.connector.svn2;
+
+import java.util.Locale;
+import java.util.Set;
+import org.jboss.dna.common.i18n.I18n;
+
+/**
+ * The internationalized string constants for the <code>org.jboss.dna.connector.svn*</code> packages.
+ */
+public final class SvnRepositoryConnectorI18n {
+
+ public static I18n connectorName;
+ public static I18n nodeDoesNotExist;
+ public static I18n nodeIsActuallyUnknow;
+ public static I18n propertyIsRequired;
+ public static I18n errorSerializingCachePolicyInSource;
+ public static I18n locationInRequestMustHavePath;
+ public static I18n sourceIsReadOnly;
+ public static I18n sourceDoesNotSupportCreatingWorkspaces;
+ public static I18n sourceDoesNotSupportCloningWorkspaces;
+ public static I18n sourceDoesNotSupportDeletingWorkspaces;
+ public static I18n connectingFailureOrUserAuthenticationProblem;
+ public static I18n pathForPredefinedWorkspaceDoesNotExist;
+ public static I18n pathForPredefinedWorkspaceIsNotDirectory;
+ public static I18n pathForPredefinedWorkspaceCannotBeRead;
+ public static I18n workspaceDoesNotExist;
+ public static I18n pathForDefaultWorkspaceDoesNotExist;
+ public static I18n pathForDefaultWorkspaceIsNotDirectory;
+ public static I18n pathForDefaultWorkspaceCannotBeRead;
+ public static I18n sameNameSiblingsAreNotAllowed;
+ public static I18n onlyTheDefaultNamespaceIsAllowed;
+ public static I18n unableToCreateWorkspaces;
+ public static I18n pathForRequestIsNotCorrect;
+ public static I18n pathForRequestMustStartWithAForwardSlash;
+ public static I18n nodeAlreadyExist;
+ public static I18n unsupportedPrimaryType;
+ public static I18n invalidPropertyNames;
+ public static I18n invalidNameForResource;
+ public static I18n invalidPathForResource;
+ public static I18n missingRequiredProperty;
+ public static I18n couldNotCreateFile;
+ public static I18n couldNotReadData;
+ public static I18n deleteFailed;
+
+ static {
+ try {
+ I18n.initialize(SvnRepositoryConnectorI18n.class);
+ } catch (final Exception err) {
+ System.err.println(err);
+ }
+ }
+
+ public static Set<Locale> getLocalizationProblemLocales() {
+ return I18n.getLocalizationProblemLocales(SvnRepositoryConnectorI18n.class);
+ }
+
+ public static Set<String> getLocalizationProblems() {
+ return I18n.getLocalizationProblems(SvnRepositoryConnectorI18n.class);
+ }
+
+ public static Set<String> getLocalizationProblems( Locale locale ) {
+ return I18n.getLocalizationProblems(SvnRepositoryConnectorI18n.class, locale);
+ }
+
+
+}
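The static initializer above delegates to `I18n.initialize(...)`, which reflectively binds each public static field to a localized message keyed by the field's name. A rough, runnable sketch of that pattern; the `Msg` class and in-memory catalog are hypothetical stand-ins for `org.jboss.dna.common.i18n.I18n` and its resource bundle:

```java
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.util.HashMap;
import java.util.Map;

public class I18nSketch {

    // Hypothetical stand-in for I18n: a message template with positional {0}, {1}, ... slots.
    public static final class Msg {
        private final String template;
        public Msg( String template ) { this.template = template; }
        public String text( Object... args ) {
            String s = template;
            for (int i = 0; i < args.length; i++) s = s.replace("{" + i + "}", String.valueOf(args[i]));
            return s;
        }
    }

    public static Msg nodeDoesNotExist;
    public static Msg workspaceDoesNotExist;

    // Bind every public static Msg field to the catalog entry matching the field's name,
    // mimicking what I18n.initialize(Class) presumably does with a resource bundle.
    public static void initialize( Class<?> i18nClass, Map<String, String> catalog ) throws Exception {
        for (Field f : i18nClass.getFields()) {
            if (Modifier.isStatic(f.getModifiers()) && f.getType() == Msg.class) {
                f.set(null, new Msg(catalog.get(f.getName())));
            }
        }
    }

    public static void main( String[] args ) throws Exception {
        Map<String, String> catalog = new HashMap<>();
        catalog.put("nodeDoesNotExist", "The node at {0} does not exist");
        catalog.put("workspaceDoesNotExist", "Workspace {0} does not exist");
        initialize(I18nSketch.class, catalog);
        System.out.println(nodeDoesNotExist.text("/a/b"));
    }
}
```

The field-per-message style keeps message keys compile-checked: a typo in a key becomes a missing-field error (or a localization problem reported by `getLocalizationProblems()`) rather than a silent runtime lookup failure.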
Property changes on: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorI18n.java
___________________________________________________________________
Name: svn:keywords
+ Id Revision
Name: svn:eol-style
+ LF
Added: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryLexicon.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryLexicon.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryLexicon.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -0,0 +1,43 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.connector.svn2;
+
+import org.jboss.dna.connector.svn2.SvnRepositorySource;
+import org.jboss.dna.graph.property.Name;
+import org.jboss.dna.graph.property.basic.BasicName;
+
+/**
+ * The namespace and property names used within a {@link SvnRepositorySource} to store internal information.
+ */
+public class SvnRepositoryLexicon {
+
+ public static class Namespace {
+ public static final String URI = "http://www.jboss.org/dna/connector/svn";
+ public static final String PREFIX = "dnasvn";
+ }
+
+ public static final Name CHILD_PATH_SEGMENT_LIST = new BasicName(Namespace.URI, "orderedChildNames");
+ public static final Name UUID = new BasicName(Namespace.URI, "uuid");
+
+}
Property changes on: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryLexicon.java
___________________________________________________________________
Name: svn:keywords
+ Id Revision
Name: svn:eol-style
+ LF
Added: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositorySource.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositorySource.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositorySource.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -0,0 +1,408 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.connector.svn2;
+
+import java.util.Hashtable;
+import java.util.List;
+import java.util.Map;
+import javax.naming.Context;
+import javax.naming.Name;
+import javax.naming.Reference;
+import javax.naming.StringRefAddr;
+import javax.naming.spi.ObjectFactory;
+import net.jcip.annotations.ThreadSafe;
+import org.jboss.dna.common.i18n.I18n;
+import org.jboss.dna.common.util.CheckArg;
+import org.jboss.dna.common.util.StringUtil;
+import org.jboss.dna.graph.connector.RepositoryConnection;
+import org.jboss.dna.graph.connector.RepositorySource;
+import org.jboss.dna.graph.connector.RepositorySourceCapabilities;
+import org.jboss.dna.graph.connector.RepositorySourceException;
+import org.jboss.dna.graph.connector.path.AbstractPathRepositorySource;
+import org.jboss.dna.graph.connector.path.PathRepositoryConnection;
+
+/**
+ * The {@link RepositorySource} for the connector that exposes an area of a local or remote SVN repository as repository
+ * content. This source considers a workspace name to be the path, relative to the repository root, of the directory that
+ * represents the root of that workspace. New workspaces can be created, as long as the names represent valid paths to
+ * existing directories.
+ */
+@ThreadSafe
+public class SvnRepositorySource extends AbstractPathRepositorySource implements ObjectFactory {
+
+ /**
+ * The first serialized version of this source. Version {@value}.
+ */
+ private static final long serialVersionUID = 1L;
+
+ protected static final String SOURCE_NAME = "sourceName";
+ protected static final String SVN_REPOSITORY_ROOT_URL = "repositoryRootURL";
+ protected static final String SVN_USERNAME = "username";
+ protected static final String SVN_PASSWORD = "password";
+ protected static final String CACHE_TIME_TO_LIVE_IN_MILLISECONDS = "cacheTimeToLiveInMilliseconds";
+ protected static final String RETRY_LIMIT = "retryLimit";
+ protected static final String ROOT_NODE_UUID = "rootNodeUuid";
+ protected static final String DEFAULT_WORKSPACE = "defaultWorkspace";
+ protected static final String PREDEFINED_WORKSPACE_NAMES = "predefinedWorkspaceNames";
+ protected static final String ALLOW_CREATING_WORKSPACES = "allowCreatingWorkspaces";
+
+ /**
+ * This source supports events.
+ */
+ protected static final boolean SUPPORTS_EVENTS = true;
+ /**
+ * This source supports same-name-siblings.
+ */
+ protected static final boolean SUPPORTS_SAME_NAME_SIBLINGS = false;
+ /**
+ * This source supports creating workspaces by default.
+ */
+ protected static final boolean DEFAULT_SUPPORTS_CREATING_WORKSPACES = true;
+ /**
+ * This source does not support updates by default, but each instance may be configured to be read-only or updateable.
+ */
+ public static final boolean DEFAULT_SUPPORTS_UPDATES = false;
+
+ /**
+ * This source supports creating references.
+ */
+ protected static final boolean SUPPORTS_REFERENCES = false;
+
+ private volatile String repositoryRootUrl;
+ private volatile String username;
+ private volatile String password;
+ private volatile String defaultWorkspace;
+ private volatile String[] predefinedWorkspaces = new String[] {};
+ private volatile RepositorySourceCapabilities capabilities = new RepositorySourceCapabilities(
+ SUPPORTS_SAME_NAME_SIBLINGS,
+ DEFAULT_SUPPORTS_UPDATES,
+ SUPPORTS_EVENTS,
+ DEFAULT_SUPPORTS_CREATING_WORKSPACES,
+ SUPPORTS_REFERENCES);
+
+ private transient SvnRepository repository;
+
+ /**
+ * Create a repository source instance.
+ */
+ public SvnRepositorySource() {
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.connector.RepositorySource#getCapabilities()
+ */
+ public RepositorySourceCapabilities getCapabilities() {
+ return capabilities;
+ }
+
+ /**
+ * @return the URL of the Subversion repository
+ */
+ public String getRepositoryRootUrl() {
+ return this.repositoryRootUrl;
+ }
+
+ /**
+ * Set the URL for the Subversion repository.
+ *
+ * @param url - the repository URL.
+ * @throws IllegalArgumentException if the URL is null or empty
+ */
+ public synchronized void setRepositoryRootUrl( String url ) {
+ CheckArg.isNotEmpty(url, "RepositoryRootUrl");
+ this.repositoryRootUrl = url;
+ }
+
+ public String getUsername() {
+ return this.username;
+ }
+
+ /**
+ * @param username
+ */
+ public synchronized void setUsername( String username ) {
+ this.username = username;
+ }
+
+ public String getPassword() {
+ return this.password;
+ }
+
+ /**
+ * @param password
+ */
+ public synchronized void setPassword( String password ) {
+ this.password = password;
+ }
+
+ /**
+ * Get whether this source supports updates.
+ *
+ * @return true if this source supports updates, or false if this source only supports reading content.
+ */
+ public boolean getSupportsUpdates() {
+ return capabilities.supportsUpdates();
+ }
+
+ /**
+ * Get the path to the existing directory in the repository that should be used for the default workspace.
+ *
+ * @return the path to the directory representing the default workspace, or null if no default has been set
+ */
+ public String getDirectoryForDefaultWorkspace() {
+ return defaultWorkspace;
+ }
+
+ public String getDefaultWorkspaceName() {
+ return defaultWorkspace;
+ }
+
+ /**
+ * Set the path to the existing directory in the repository that should be used for the default workspace.
+ *
+ * @param pathToDirectoryForDefaultWorkspace the valid and resolvable path to the directory representing the default
+ * workspace, or null if no default should be set
+ */
+ public synchronized void setDirectoryForDefaultWorkspace( String pathToDirectoryForDefaultWorkspace ) {
+ this.defaultWorkspace = pathToDirectoryForDefaultWorkspace;
+ }
+
+ /**
+ * Gets the names of the workspaces that are available when this source is created. Each workspace name corresponds to a path
+ * to a directory in the repository.
+ *
+ * @return the names of the workspaces that this source starts with, or null if there are no such workspaces
+ * @see #setPredefinedWorkspaceNames(String[])
+ * @see #setCreatingWorkspacesAllowed(boolean)
+ */
+ public synchronized String[] getPredefinedWorkspaceNames() {
+ String[] copy = new String[predefinedWorkspaces.length];
+ System.arraycopy(predefinedWorkspaces, 0, copy, 0, predefinedWorkspaces.length);
+ return copy;
+ }
+
+ /**
+ * Sets the names of the workspaces that are available when this source is created. Each workspace name corresponds to a path
+ * to a directory in the repository.
+ *
+ * @param predefinedWorkspaceNames the names of the workspaces that this source should start with, or null if there are no
+ * such workspaces
+ * @see #setCreatingWorkspacesAllowed(boolean)
+ * @see #getPredefinedWorkspaceNames()
+ */
+ public synchronized void setPredefinedWorkspaceNames( String[] predefinedWorkspaceNames ) {
+ this.predefinedWorkspaces = predefinedWorkspaceNames;
+ }
+
+ /**
+ * Get whether this source allows workspaces to be created dynamically.
+ *
+ * @return true if this source allows workspaces to be created by clients, or false if the set of workspaces is fixed
+ * @see #setPredefinedWorkspaceNames(String[])
+ * @see #getPredefinedWorkspaceNames()
+ * @see #setCreatingWorkspacesAllowed(boolean)
+ */
+ public boolean isCreatingWorkspacesAllowed() {
+ return capabilities.supportsCreatingWorkspaces();
+ }
+
+ /**
+ * Set whether this source allows workspaces to be created dynamically.
+ *
+ * @param allowWorkspaceCreation true if this source allows workspaces to be created by clients, or false if the set of
+ * workspaces is fixed
+ * @see #setPredefinedWorkspaceNames(String[])
+ * @see #getPredefinedWorkspaceNames()
+ * @see #isCreatingWorkspacesAllowed()
+ */
+ public synchronized void setCreatingWorkspacesAllowed( boolean allowWorkspaceCreation ) {
+ capabilities = new RepositorySourceCapabilities(capabilities.supportsSameNameSiblings(), capabilities.supportsUpdates(),
+ capabilities.supportsEvents(), allowWorkspaceCreation,
+ capabilities.supportsReferences());
+ }
+
+ /**
+ * Get whether this source allows updates.
+ *
+ * @return true if this source allows updates by clients, or false if no updates are allowed
+ * @see #setUpdatesAllowed(boolean)
+ */
+ @Override
+ public boolean areUpdatesAllowed() {
+ return capabilities.supportsUpdates();
+ }
+
+ /**
+ * Set whether this source allows updates to data within workspaces.
+ *
+ * @param allowUpdates true if this source allows clients to update data within workspaces, or false if updates are not
+ * allowed.
+ * @see #areUpdatesAllowed()
+ */
+ public synchronized void setUpdatesAllowed( boolean allowUpdates ) {
+ capabilities = new RepositorySourceCapabilities(capabilities.supportsSameNameSiblings(), allowUpdates,
+ capabilities.supportsEvents(), capabilities.supportsCreatingWorkspaces(),
+ capabilities.supportsReferences());
+ }
+
+ /**
+ * {@inheritDoc}
+ */
+ @Override
+ public boolean equals( Object obj ) {
+ if (obj == this) return true;
+ if (obj instanceof SvnRepositorySource) {
+ SvnRepositorySource that = (SvnRepositorySource)obj;
+ if (this.getName() == null) {
+ if (that.getName() != null) return false;
+ } else {
+ if (!this.getName().equals(that.getName())) return false;
+ }
+ return true;
+ }
+ return false;
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see javax.naming.Referenceable#getReference()
+ */
+ public synchronized Reference getReference() {
+ String className = getClass().getName();
+ String factoryClassName = this.getClass().getName();
+ Reference ref = new Reference(className, factoryClassName, null);
+
+ if (getName() != null) {
+ ref.add(new StringRefAddr(SOURCE_NAME, getName()));
+ }
+ if (getRepositoryRootUrl() != null) {
+ ref.add(new StringRefAddr(SVN_REPOSITORY_ROOT_URL, getRepositoryRootUrl()));
+ }
+ if (getUsername() != null) {
+ ref.add(new StringRefAddr(SVN_USERNAME, getUsername()));
+ }
+ if (getPassword() != null) {
+ ref.add(new StringRefAddr(SVN_PASSWORD, getPassword()));
+ }
+ ref.add(new StringRefAddr(RETRY_LIMIT, Integer.toString(getRetryLimit())));
+ ref.add(new StringRefAddr(ROOT_NODE_UUID, rootNodeUuid.toString()));
+ ref.add(new StringRefAddr(DEFAULT_WORKSPACE, getDirectoryForDefaultWorkspace()));
+ ref.add(new StringRefAddr(ALLOW_CREATING_WORKSPACES, Boolean.toString(isCreatingWorkspacesAllowed())));
+ String[] workspaceNames = getPredefinedWorkspaceNames();
+ if (workspaceNames != null && workspaceNames.length != 0) {
+ ref.add(new StringRefAddr(PREDEFINED_WORKSPACE_NAMES, StringUtil.combineLines(workspaceNames)));
+ }
+ return ref;
+
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see javax.naming.spi.ObjectFactory#getObjectInstance(java.lang.Object, javax.naming.Name, javax.naming.Context,
+ * java.util.Hashtable)
+ */
+ public Object getObjectInstance( Object obj,
+ Name name,
+ Context nameCtx,
+ Hashtable<?, ?> environment ) throws Exception {
+ if (!(obj instanceof Reference)) return null;
+
+ Map<String, Object> values = valuesFrom((Reference)obj);
+
+ String sourceName = (String)values.get(SOURCE_NAME);
+ String repositoryRootUrl = (String)values.get(SVN_REPOSITORY_ROOT_URL);
+ String username = (String)values.get(SVN_USERNAME);
+ String password = (String)values.get(SVN_PASSWORD);
+ String retryLimit = (String)values.get(RETRY_LIMIT);
+ String rootNodeUuid = (String)values.get(ROOT_NODE_UUID);
+ String defaultWorkspace = (String)values.get(DEFAULT_WORKSPACE);
+ String createWorkspaces = (String)values.get(ALLOW_CREATING_WORKSPACES);
+
+ String combinedWorkspaceNames = (String)values.get(PREDEFINED_WORKSPACE_NAMES);
+ String[] workspaceNames = null;
+ if (combinedWorkspaceNames != null) {
+ List<String> paths = StringUtil.splitLines(combinedWorkspaceNames);
+ workspaceNames = paths.toArray(new String[paths.size()]);
+ }
+ // Create the source instance ...
+ SvnRepositorySource source = new SvnRepositorySource();
+ if (sourceName != null) source.setName(sourceName);
+ if (repositoryRootUrl != null && repositoryRootUrl.length() > 0) source.setRepositoryRootUrl(repositoryRootUrl);
+ if (username != null) source.setUsername(username);
+ if (password != null) source.setPassword(password);
+ if (retryLimit != null) source.setRetryLimit(Integer.parseInt(retryLimit));
+ if (rootNodeUuid != null) source.setRootNodeUuid(rootNodeUuid);
+ if (defaultWorkspace != null) source.setDirectoryForDefaultWorkspace(defaultWorkspace);
+ if (createWorkspaces != null) source.setCreatingWorkspacesAllowed(Boolean.parseBoolean(createWorkspaces));
+ if (workspaceNames != null && workspaceNames.length != 0) source.setPredefinedWorkspaceNames(workspaceNames);
+ return source;
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.connector.RepositorySource#getConnection()
+ */
+ public synchronized RepositoryConnection getConnection() throws RepositorySourceException {
+
+ String sourceName = getName();
+ if (sourceName == null || sourceName.trim().length() == 0) {
+ I18n msg = SvnRepositoryConnectorI18n.propertyIsRequired;
+ throw new RepositorySourceException(getName(), msg.text("name"));
+ }
+
+ String sourceUsername = getUsername();
+ if (sourceUsername == null || sourceUsername.trim().length() == 0) {
+ I18n msg = SvnRepositoryConnectorI18n.propertyIsRequired;
+ throw new RepositorySourceException(getName(), msg.text("username"));
+ }
+
+ String sourcePassword = getPassword();
+ if (sourcePassword == null) {
+ I18n msg = SvnRepositoryConnectorI18n.propertyIsRequired;
+ throw new RepositorySourceException(getName(), msg.text("password"));
+ }
+
+ String repositoryRootURL = getRepositoryRootUrl();
+ if (repositoryRootURL == null || repositoryRootURL.trim().length() == 0) {
+ I18n msg = SvnRepositoryConnectorI18n.propertyIsRequired;
+ throw new RepositorySourceException(getName(), msg.text("repositoryRootURL"));
+ }
+
+ if (this.repository == null) {
+ this.repository = new SvnRepository(this);
+ }
+
+ return new PathRepositoryConnection(this, this.repository);
+ }
+}
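The `getReference()`/`getObjectInstance()` pair above round-trips the source's configuration through JNDI `StringRefAddr` entries. Since `javax.naming` ships with the JDK, the mechanism can be sketched stand-alone; the `valuesFrom` helper here is a hypothetical reimplementation of the one inherited from `AbstractPathRepositorySource`, shown only to illustrate the pattern:

```java
import java.util.Enumeration;
import java.util.HashMap;
import java.util.Map;
import javax.naming.RefAddr;
import javax.naming.Reference;
import javax.naming.StringRefAddr;

public class ReferenceRoundTrip {

    // Flatten a JNDI Reference's address entries into a map keyed by address type,
    // the inverse of what getReference() builds with StringRefAddr instances.
    public static Map<String, Object> valuesFrom( Reference ref ) {
        Map<String, Object> values = new HashMap<>();
        Enumeration<RefAddr> addrs = ref.getAll();
        while (addrs.hasMoreElements()) {
            RefAddr addr = addrs.nextElement();
            values.put(addr.getType(), addr.getContent());
        }
        return values;
    }

    public static void main( String[] args ) {
        // Build a Reference the way getReference() does, then read it back
        Reference ref = new Reference("SvnRepositorySource", "SvnRepositorySource", null);
        ref.add(new StringRefAddr("sourceName", "svn"));
        ref.add(new StringRefAddr("repositoryRootURL", "http://svn.example.com/repo"));
        Map<String, Object> values = valuesFrom(ref);
        System.out.println(values.get("sourceName"));
        System.out.println(values.get("repositoryRootURL"));
    }
}
```

Because everything is carried as strings, multi-valued settings such as the predefined workspace names must be joined into one string on the way out (`StringUtil.combineLines`) and split again on the way in (`StringUtil.splitLines`), which is exactly what the code above does.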
Property changes on: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositorySource.java
___________________________________________________________________
Name: svn:keywords
+ Id Revision
Name: svn:eol-style
+ LF
Added: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryUtil.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryUtil.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryUtil.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -0,0 +1,235 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.connector.svn2;
+
+import java.util.Collection;
+import java.util.Collections;
+import org.jboss.dna.graph.connector.RepositorySourceException;
+import org.jboss.dna.graph.request.InvalidWorkspaceException;
+import org.tmatesoft.svn.core.SVNDirEntry;
+import org.tmatesoft.svn.core.SVNErrorCode;
+import org.tmatesoft.svn.core.SVNErrorMessage;
+import org.tmatesoft.svn.core.SVNException;
+import org.tmatesoft.svn.core.SVNNodeKind;
+import org.tmatesoft.svn.core.SVNURL;
+import org.tmatesoft.svn.core.auth.ISVNAuthenticationManager;
+import org.tmatesoft.svn.core.internal.io.dav.DAVRepositoryFactory;
+import org.tmatesoft.svn.core.internal.io.fs.FSRepositoryFactory;
+import org.tmatesoft.svn.core.internal.io.svn.SVNRepositoryFactoryImpl;
+import org.tmatesoft.svn.core.io.SVNRepository;
+import org.tmatesoft.svn.core.io.SVNRepositoryFactory;
+import org.tmatesoft.svn.core.wc.SVNWCUtil;
+
+/**
+ * Utility methods for working with {@link SVNRepository} repositories and their paths.
+ */
+public class SvnRepositoryUtil {
+
+ /**
+ * Parse the supplied URL into an {@link SVNURL}.
+ *
+ * @param url - the URL to parse.
+ * @param sourceName - the name of the repository source, used in error messages.
+ * @return the parsed {@link SVNURL}
+ */
+ public static SVNURL createSVNURL( String url,
+ String sourceName ) {
+
+ SVNURL theUrl;
+ try {
+ theUrl = SVNURL.parseURIDecoded(url);
+ } catch (SVNException e) {
+ // protocol not supported by this connector
+ throw new RepositorySourceException(sourceName,
+ "The protocol is not supported by this connector, or there is a problem in the SVN URL");
+ }
+ return theUrl;
+ }
+
+ public static void setNewSVNRepositoryLocation( SVNRepository oldRepository,
+ String url,
+ boolean forceReconnect,
+ String sourceName ) {
+ try {
+ oldRepository.setLocation(createSVNURL(url, sourceName), forceReconnect);
+ } catch (SVNException e) {
+ throw new RepositorySourceException(sourceName, "The old URL and the new URL have different protocols");
+ }
+ }
+
+ /**
+ * @param repository the repository to check
+ * @param path the path within the repository
+ * @param revisionNumber the revision to check, or -1 for the latest revision
+ * @param sourceName the name of the repository source
+ * @return the {@link SVNNodeKind} at the path, or null if the path could not be checked
+ */
+ public static SVNNodeKind checkThePath( SVNRepository repository,
+ String path,
+ long revisionNumber,
+ String sourceName ) {
+ try {
+ return repository.checkPath(path, revisionNumber);
+ } catch (SVNException e) {
+ // the path could not be checked, so treat its kind as unknown
+ return null;
+ }
+ }
+
+ /**
+ * Create a {@link SVNRepository} for the given URL (http, https, svn, svn+ssh, or file protocol).
+ *
+ * @param url the URL of the repository
+ * @param username the username credential
+ * @param password the password credential
+ * @return the {@link SVNRepository}
+ */
+ public static SVNRepository createRepository( String url,
+ String username,
+ String password ) {
+ // for DAV (over http and https)
+ DAVRepositoryFactory.setup();
+ // For File
+ FSRepositoryFactory.setup();
+ // for SVN (over svn and svn+ssh)
+ SVNRepositoryFactoryImpl.setup();
+
+ // The factory knows how to create a DAVRepository
+ SVNRepository repository;
+ try {
+ repository = SVNRepositoryFactory.create(SVNURL.parseURIDecoded(url));
+
+ ISVNAuthenticationManager authManager = SVNWCUtil.createDefaultAuthenticationManager(username, password);
+ repository.setAuthenticationManager(authManager);
+ } catch (SVNException e) {
+ throw new InvalidWorkspaceException(SvnRepositoryConnectorI18n.workspaceDoesNotExist.text(e.getMessage()));
+ }
+ return repository;
+ }
+
+ /**
+ * Utility to get the last segment of the repository location path.
+ *
+ * @param repository the repository whose location is examined
+ * @return the last segment of the path.
+ */
+ public static String getRepositoryWorspaceName( SVNRepository repository ) {
+ String[] segments = repository.getLocation().getPath().split("/");
+ return segments[segments.length - 1];
+ }
+
+ private SvnRepositoryUtil() {
+ // prevent instantiation
+ }
+
+ /**
+ * Check if the repository exists.
+ *
+ * @param repos the repository to check
+ * @return true if the repository exists and false otherwise.
+ */
+ public static boolean exist( SVNRepository repos ) {
+ try {
+ return repos.checkPath("", -1) != SVNNodeKind.NONE;
+ } catch (SVNException e) {
+ return false;
+ }
+ }
+
+ /**
+ * Check if the repository path is a directory.
+ *
+ * @param repos the repository to check
+ * @param path the path within the repository
+ * @return true if the repository path is a directory and false otherwise.
+ */
+ */
+ public static boolean isDirectory( SVNRepository repos,
+ String path ) {
+ try {
+ SVNNodeKind kind = repos.checkPath(path, -1);
+ if (kind == SVNNodeKind.DIR) {
+ return true;
+ }
+ } catch (SVNException e) {
+ return false;
+ }
+ return false;
+ }
+
+ /**
+ * @param repos the repository to read from
+ * @param path the directory path
+ * @return the collection of entries in the directory at the path; never null
+ */
+ */
+ @SuppressWarnings( "unchecked" )
+ public static Collection<SVNDirEntry> getDir( SVNRepository repos,
+ String path ) {
+ try {
+ return repos.getDir(path, -1, null, (Collection<SVNDirEntry>)null);
+ } catch (SVNException e) {
+ return Collections.emptyList();
+ }
+ }
+
+ /**
+ * Check if the repository path is a file.
+ *
+ * @param repos the repository to check
+ * @param path the path within the repository
+ * @return true if the path is a file and false otherwise.
+ */
+ */
+ public static boolean isFile( SVNRepository repos,
+ String path ) {
+ try {
+ SVNNodeKind kind = repos.checkPath(path, -1);
+ if (kind == SVNNodeKind.FILE) {
+ return true;
+ }
+ } catch (SVNException e) {
+ return false;
+ }
+ return false;
+ }
+
+ /**
+ * Check if the path exists in the repository.
+ *
+ * @param repository the repository to check
+ * @param path the path within the repository
+ * @return true if the path exists and false otherwise.
+ * @throws SVNException if the path cannot be checked
+ */
+ public static boolean exists( SVNRepository repository,
+ String path ) throws SVNException {
+ try {
+ SVNNodeKind kind = repository.checkPath(path, -1);
+ // NONE means no entry at the path; UNKNOWN means the kind could not be determined
+ return kind != SVNNodeKind.NONE && kind != SVNNodeKind.UNKNOWN;
+ } catch (SVNException e) {
+ SVNErrorMessage err = SVNErrorMessage.create(SVNErrorCode.UNKNOWN,
+ "unknown error while checking whether the path exists: {0}",
+ e.getMessage());
+ throw new SVNException(err);
+ }
+ }
+}
Property changes on: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/SvnRepositoryUtil.java
___________________________________________________________________
Name: svn:keywords
+ Id Revision
Name: svn:eol-style
+ LF
Added: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/package-info.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/package-info.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/package-info.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -0,0 +1,29 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+/**
+ * The classes that make up the connector that accesses content from an SVN repository.
+ */
+
+package org.jboss.dna.connector.svn2;
+
Property changes on: trunk/extensions/dna-connector-svn/src/main/java/org/jboss/dna/connector/svn2/package-info.java
___________________________________________________________________
Name: svn:keywords
+ Id Revision
Name: svn:eol-style
+ LF
Copied: trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn2 (from rev 1520, trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn)
Modified: trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn2/SVNRepositoryConnectorI18n.properties
===================================================================
--- trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn/SVNRepositoryConnectorI18n.properties 2010-01-04 15:33:19 UTC (rev 1520)
+++ trunk/extensions/dna-connector-svn/src/main/resources/org/jboss/dna/connector/svn2/SVNRepositoryConnectorI18n.properties 2010-01-05 12:43:25 UTC (rev 1524)
@@ -54,6 +54,6 @@
# Writable tests
-couldNotCreateFile =Error reading data in workspace "{1}" "{0}" "{2}" "{3}"
+couldNotCreateFile =Error reading data at path "{0}" in workspace "{1}" in source "{2}": "{3}"
couldNotReadData= Error reading data in workspace "{1}" "{0}" "{2}" "{3}"
deleteFailed=Error deleting path {0} in workspace with source name {1}
\ No newline at end of file
Added: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnConnectorTestUtil.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnConnectorTestUtil.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnConnectorTestUtil.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -0,0 +1,177 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.connector.svn2;
+
+import java.io.File;
+import java.io.IOException;
+import java.util.Collection;
+import java.util.Iterator;
+import org.jboss.dna.common.util.FileUtil;
+import org.tmatesoft.svn.core.SVNDirEntry;
+import org.tmatesoft.svn.core.SVNException;
+import org.tmatesoft.svn.core.SVNNodeKind;
+import org.tmatesoft.svn.core.SVNURL;
+import org.tmatesoft.svn.core.auth.ISVNAuthenticationManager;
+import org.tmatesoft.svn.core.internal.io.dav.DAVRepositoryFactory;
+import org.tmatesoft.svn.core.internal.io.fs.FSRepositoryFactory;
+import org.tmatesoft.svn.core.internal.io.svn.SVNRepositoryFactoryImpl;
+import org.tmatesoft.svn.core.io.SVNRepository;
+import org.tmatesoft.svn.core.io.SVNRepositoryFactory;
+import org.tmatesoft.svn.core.wc.SVNWCUtil;
+
+/**
+ * @author Serge Pagop
+ */
+public class SvnConnectorTestUtil {
+
+ @SuppressWarnings( "unchecked" )
+ public static void main( String[] args ) throws Exception {
+ try {
+ System.out.println("My repos. ......");
+ String svnUrl = SvnConnectorTestUtil.createURL("src/test/resources/dummy_svn_repos", "target/copy_of dummy_svn_repos");
+ String username = "sp";
+ String password = "";
+ System.out.println(svnUrl);
+ SVNRepository trunkWorkspace = createRepository(svnUrl + "/trunk", username, password);
+ System.out.println("Repository location: " + trunkWorkspace.getLocation().toString());
+ System.out.println("Repository Root: " + trunkWorkspace.getRepositoryRoot(true));
+ System.out.println("Repository UUID: " + trunkWorkspace.getRepositoryUUID(true));
+ // Returns the repository location to which this object is set. It may be the location that was used to create this
+ // object (see SVNRepositoryFactory#create(SVNURL)), or the most recent one the object was set to.
+ System.out.println("location: " + trunkWorkspace.getLocation().getPath());
+ System.out.println("decoded location: " + trunkWorkspace.getLocation().toDecodedString());
+ System.out.println("last seg: " + getRepositoryWorspaceName(trunkWorkspace));
+
+ final Collection<SVNDirEntry> dirEntries = trunkWorkspace.getDir("", -1, null, (Collection<SVNDirEntry>)null);
+ for (SVNDirEntry dirEntry : dirEntries) {
+ System.out.println("name: " + dirEntry.getName());
+ }
+ } catch (SVNException e) {
+ e.printStackTrace();
+ }
+ }
+
+ /**
+ * Create a {@link SVNRepository} for the given URL.
+ *
+ * @param url the URL of the repository
+ * @param username the username credential
+ * @param password the password credential
+ * @return the {@link SVNRepository}
+ * @throws SVNException if the URL cannot be parsed or the repository cannot be created
+ */
+ public static SVNRepository createRepository( String url,
+ String username,
+ String password ) throws SVNException {
+ // for DAV (over http and https)
+ DAVRepositoryFactory.setup();
+ // For File
+ FSRepositoryFactory.setup();
+ // for SVN (over svn and svn+ssh)
+ SVNRepositoryFactoryImpl.setup();
+
+ // The factory knows how to create a DAVRepository
+ SVNRepository repository = SVNRepositoryFactory.create(SVNURL.parseURIDecoded(url));
+ ISVNAuthenticationManager authManager = SVNWCUtil.createDefaultAuthenticationManager(username, password);
+ repository.setAuthenticationManager(authManager);
+ return repository;
+ }
+
+ public static String createURL( String src,
+ String dst ) throws IOException, SVNException {
+ // First we need to find the absolute path. Note that Maven always runs the tests from the project's directory,
+ // so use new File to create an instance at the current location ...
+ File mySrc = new File(src);
+ File myDst = new File(dst);
+
+ // make sure the destination is empty before we copy
+ FileUtil.delete(myDst);
+ FileUtil.copy(mySrc, myDst);
+
+ // Now set the two path roots
+ String url = myDst.getCanonicalFile().toURI().toURL().toExternalForm();
+
+ url = url.replaceFirst("file:/", "file://localhost/");
+
+ // Have to decode the URL ...
+ SVNURL encodedUrl = SVNURL.parseURIEncoded(url);
+ url = encodedUrl.toDecodedString();
+
+ if (!url.endsWith("/")) url = url + "/";
+ return url;
+ }
+
+ @SuppressWarnings( "unchecked" )
+ public static void listEntries( SVNRepository workspace,
+ String path ) throws SVNException {
+ Collection<SVNDirEntry> entries = workspace.getDir(path, -1, null, (Collection)null);
+ Iterator<SVNDirEntry> iterator = entries.iterator();
+ while (iterator.hasNext()) {
+ SVNDirEntry entry = iterator.next();
+ System.out.println("/" + (path.equals("") ? "" : path + "/") + entry.getName() + " ( author: '" + entry.getAuthor()
+ + "'; revision: " + entry.getRevision() + "; date: " + entry.getDate() + ")");
+ if (entry.getKind() == SVNNodeKind.DIR) {
+ listEntries(workspace, (path.equals("")) ? entry.getName() : path + "/" + entry.getName());
+ }
+ }
+ }
+
+ public static String getRepositoryWorspaceName( SVNRepository repository ) {
+ String[] segments = repository.getLocation().getPath().split("/");
+ return segments[segments.length - 1];
+ }
+
+ private SvnConnectorTestUtil() {
+ // prevent instantiation
+ }
+
+}
Property changes on: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnConnectorTestUtil.java
___________________________________________________________________
Name: svn:keywords
+ Id Revision
Name: svn:eol-style
+ LF
Added: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnIntegrationTest.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnIntegrationTest.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnIntegrationTest.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -0,0 +1,110 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.connector.svn2;
+
+import static org.hamcrest.core.Is.is;
+import static org.hamcrest.core.IsNull.notNullValue;
+import static org.junit.Assert.assertThat;
+import java.util.Map;
+import org.jboss.dna.graph.ExecutionContext;
+import org.jboss.dna.graph.Graph;
+import org.jboss.dna.graph.Location;
+import org.jboss.dna.graph.Node;
+import org.jboss.dna.graph.Subgraph;
+import org.jboss.dna.graph.connector.RepositoryConnection;
+import org.jboss.dna.graph.connector.RepositoryConnectionFactory;
+import org.jboss.dna.graph.connector.RepositoryContext;
+import org.jboss.dna.graph.connector.RepositorySourceException;
+import org.jboss.dna.graph.observe.Observer;
+import org.jboss.dna.graph.property.Name;
+import org.jboss.dna.graph.property.Property;
+import org.junit.Before;
+import org.junit.Test;
+
+public class SvnIntegrationTest {
+
+ private ExecutionContext context;
+ private SvnRepositorySource source;
+ private String repositoryUrl;
+ private String[] predefinedWorkspaceNames;
+
+ @Before
+ public void beforeEach() {
+ repositoryUrl = "http://anonsvn.jboss.org/repos/dna/";
+ predefinedWorkspaceNames = new String[] {repositoryUrl + "trunk", repositoryUrl + "tags", repositoryUrl + "branches"};
+ context = new ExecutionContext();
+ source = new SvnRepositorySource();
+ source.setName("svn repository source");
+ source.setRepositoryRootUrl(repositoryUrl);
+ source.setUsername("anonymous");
+ source.setPassword("");
+ source.setCreatingWorkspacesAllowed(true);
+ source.setPredefinedWorkspaceNames(predefinedWorkspaceNames);
+ source.setDirectoryForDefaultWorkspace(predefinedWorkspaceNames[0]);
+ source.setCreatingWorkspacesAllowed(false);
+ source.initialize(new RepositoryContext() {
+
+ public Subgraph getConfiguration( int depth ) {
+ return null;
+ }
+
+ public ExecutionContext getExecutionContext() {
+ return context;
+ }
+
+ public Observer getObserver() {
+ return null;
+ }
+
+ public RepositoryConnectionFactory getRepositoryConnectionFactory() {
+ return new RepositoryConnectionFactory() {
+
+ public RepositoryConnection createConnection( String sourceName ) throws RepositorySourceException {
+ return null;
+ }
+
+ };
+ }
+
+ });
+ }
+
+ @Test
+ public void shouldConnectAndReadRootNode() {
+ Graph graph = Graph.create(source, context);
+ Map<Name, Property> properties = graph.getPropertiesByName().on("/");
+ assertThat(properties, is(notNullValue()));
+
+ Node root = graph.getNodeAt("/");
+ assertThat(root, is(notNullValue()));
+ assertThat(root.getLocation(), is(notNullValue()));
+ assertThat(root.getChildren().isEmpty(), is(false));
+ for (Location childLocation : root.getChildren()) {
+ assertThat(childLocation.getPath().getParent().isRoot(), is(true));
+ // Node child = graph.getNodeAt(childLocation);
+ // assertThat(child.getLocation(), is(childLocation));
+ // assertThat(child.getLocation().getPath().getParent().isRoot(), is(true));
+ }
+ }
+}
Property changes on: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnIntegrationTest.java
___________________________________________________________________
Name: svn:keywords
+ Id Revision
Name: svn:eol-style
+ LF
Added: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorI18nTest.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorI18nTest.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorI18nTest.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -0,0 +1,35 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.connector.svn2;
+
+import org.jboss.dna.common.AbstractI18nTest;
+
+/**
+ */
+public class SvnRepositoryConnectorI18nTest extends AbstractI18nTest {
+
+ public SvnRepositoryConnectorI18nTest() {
+ super(SvnRepositoryConnectorI18n.class);
+ }
+}
Property changes on: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorI18nTest.java
___________________________________________________________________
Name: svn:keywords
+ Id Revision
Name: svn:eol-style
+ LF
Added: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorNoCreateWorkspaceTest.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorNoCreateWorkspaceTest.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorNoCreateWorkspaceTest.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -0,0 +1,68 @@
+package org.jboss.dna.connector.svn2;
+
+import org.jboss.dna.graph.Graph;
+import org.jboss.dna.graph.connector.RepositorySource;
+import org.jboss.dna.graph.connector.test.WorkspaceConnectorTest;
+import org.junit.BeforeClass;
+
+public class SvnRepositoryConnectorNoCreateWorkspaceTest extends WorkspaceConnectorTest {
+
+ private static String url;
+
+ @BeforeClass
+ public static void beforeAny() throws Exception {
+ url = SvnConnectorTestUtil.createURL("src/test/resources/dummy_svn_repos", "target/copy_of dummy_svn_repos");
+
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.connector.test.AbstractConnectorTest#setUpSource()
+ */
+ @Override
+ protected RepositorySource setUpSource() throws Exception {
+ String[] predefinedWorkspaceNames = new String[] {url + "trunk", url + "tags"};
+ SvnRepositorySource source = new SvnRepositorySource();
+ source.setName("Test Repository");
+ source.setUsername("sp");
+ source.setPassword("");
+ source.setRepositoryRootUrl(url);
+ source.setPredefinedWorkspaceNames(predefinedWorkspaceNames);
+ source.setDirectoryForDefaultWorkspace(predefinedWorkspaceNames[0]);
+ source.setCreatingWorkspacesAllowed(false);
+
+ return source;
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.connector.test.AbstractConnectorTest#initializeContent(org.jboss.dna.graph.Graph)
+ */
+ @Override
+ protected void initializeContent( Graph graph ) throws Exception {
+ // No need to initialize any content ...
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.connector.test.WorkspaceConnectorTest#generateInvalidNamesForNewWorkspaces()
+ */
+ @Override
+ protected String[] generateInvalidNamesForNewWorkspaces() {
+ return null; // nothing is considered invalid
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.connector.test.WorkspaceConnectorTest#generateValidNamesForNewWorkspaces()
+ */
+ @Override
+ protected String[] generateValidNamesForNewWorkspaces() {
+ return new String[] {url + "branches"};
+ }
+
+}
Property changes on: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorNoCreateWorkspaceTest.java
___________________________________________________________________
Name: svn:keywords
+ Id Revision
Name: svn:eol-style
+ LF
Added: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorNotWritableTest.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorNotWritableTest.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorNotWritableTest.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -0,0 +1,48 @@
+package org.jboss.dna.connector.svn2;
+
+import org.jboss.dna.graph.Graph;
+import org.jboss.dna.graph.connector.RepositorySource;
+import org.jboss.dna.graph.connector.test.NotWritableConnectorTest;
+import org.junit.BeforeClass;
+
+public class SvnRepositoryConnectorNotWritableTest extends NotWritableConnectorTest {
+
+ private static String url;
+
+ @BeforeClass
+ public static void beforeAny() throws Exception {
+ url = SvnConnectorTestUtil.createURL("src/test/resources/dummy_svn_repos", "target/copy_of dummy_svn_repos");
+
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.connector.test.AbstractConnectorTest#setUpSource()
+ */
+ @Override
+ protected RepositorySource setUpSource() throws Exception {
+ String[] predefinedWorkspaceNames = new String[] {url + "trunk", url + "tags"};
+ SvnRepositorySource source = new SvnRepositorySource();
+ source.setName("Test Repository");
+ source.setUsername("sp");
+ source.setPassword("");
+ source.setRepositoryRootUrl(url);
+ source.setPredefinedWorkspaceNames(predefinedWorkspaceNames);
+ source.setDirectoryForDefaultWorkspace(predefinedWorkspaceNames[0]);
+ source.setCreatingWorkspacesAllowed(false);
+
+ return source;
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.connector.test.AbstractConnectorTest#initializeContent(org.jboss.dna.graph.Graph)
+ */
+ @Override
+ protected void initializeContent( Graph graph ) {
+ // No need to initialize any content ...
+ }
+}
Property changes on: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorNotWritableTest.java
___________________________________________________________________
Name: svn:keywords
+ Id Revision
Name: svn:eol-style
+ LF
Added: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorWritableTest.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorWritableTest.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorWritableTest.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -0,0 +1,360 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.connector.svn2;
+
+import static org.hamcrest.core.Is.is;
+import static org.hamcrest.core.IsNull.notNullValue;
+import static org.junit.Assert.assertThat;
+import java.io.ByteArrayOutputStream;
+import org.jboss.dna.graph.DnaLexicon;
+import org.jboss.dna.graph.Graph;
+import org.jboss.dna.graph.JcrLexicon;
+import org.jboss.dna.graph.JcrMixLexicon;
+import org.jboss.dna.graph.JcrNtLexicon;
+import org.jboss.dna.graph.connector.RepositorySource;
+import org.jboss.dna.graph.connector.RepositorySourceException;
+import org.jboss.dna.graph.connector.test.AbstractConnectorTest;
+import org.jboss.dna.graph.property.PathNotFoundException;
+import org.junit.Test;
+import org.tmatesoft.svn.core.SVNNodeKind;
+import org.tmatesoft.svn.core.SVNProperties;
+import org.tmatesoft.svn.core.io.SVNRepository;
+
+/**
+ * @author Serge Pagop
+ */
+public class SvnRepositoryConnectorWritableTest extends AbstractConnectorTest {
+
+ protected static final String EMPTY_CONTENT = "";
+ protected static final String TEST_CONTENT = "Test content";
+ protected SVNRepository remoteRepos = null;
+ protected String url;
+ protected SVNNodeKind kind = null;
+ protected SVNProperties fileProperties = null;
+ protected ByteArrayOutputStream baos = null;
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.connector.test.AbstractConnectorTest#setUpSource()
+ */
+ @Override
+ protected RepositorySource setUpSource() throws Exception {
+ url = SvnConnectorTestUtil.createURL("src/test/resources/dummy_svn_repos", "target/copy_of dummy_svn_repos");
+ String[] predefinedWorkspaceNames = new String[] {url + "trunk", url + "tags"};
+ SvnRepositorySource source = new SvnRepositorySource();
+ source.setName("Test Repository");
+ source.setUsername("sp");
+ source.setPassword("");
+ source.setRepositoryRootUrl(url);
+ source.setPredefinedWorkspaceNames(predefinedWorkspaceNames);
+ source.setDirectoryForDefaultWorkspace(predefinedWorkspaceNames[0]);
+ source.setCreatingWorkspacesAllowed(Boolean.TRUE);
+ source.setUpdatesAllowed(true);
+
+ remoteRepos = SvnConnectorTestUtil.createRepository(url + "trunk", "sp", "");
+
+ return source;
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.connector.test.AbstractConnectorTest#initializeContent(org.jboss.dna.graph.Graph)
+ */
+ @Override
+ protected void initializeContent( Graph graph ) throws Exception {
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.connector.test.AbstractConnectorTest#afterEach()
+ */
+ @Override
+ public void afterEach() throws Exception {
+ remoteRepos = null;
+ super.afterEach();
+ }
+
+ @Test( expected = RepositorySourceException.class )
+ public void shouldNotBeAbleToCreateInvalidTypeForRepository() {
+ graph.create("/testFile").with(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.UNSTRUCTURED).orReplace().and();
+ }
+
+ @Test( expected = RepositorySourceException.class )
+ public void shouldNotBeAbleToSetArbitraryProperties() {
+ graph.create("/testFile").with(JcrLexicon.MIXIN_TYPES, JcrMixLexicon.LOCKABLE).orReplace().and();
+ }
+
+ @Test
+ public void shouldBeAbleToCreateNodeFileWithContentLevel1() throws Exception {
+
+ // LEVEL 0
+ graph.create("/testFile").with(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FILE).orReplace().and();
+ graph.create("/testFile/jcr:content").with(JcrLexicon.PRIMARY_TYPE, DnaLexicon.RESOURCE).and(JcrLexicon.DATA,
+ TEST_CONTENT.getBytes()).orReplace().and();
+ kind = remoteRepos.checkPath("testFile", -1);
+ assertThat(kind, is(SVNNodeKind.FILE));
+ fileProperties = new SVNProperties();
+ baos = new ByteArrayOutputStream();
+ remoteRepos.getFile("testFile", -1, fileProperties, baos);
+ assertContents(baos, TEST_CONTENT);
+
+ graph.create("/testFile1").with(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FILE).orReplace().and();
+ graph.create("/testFile1/jcr:content").with(JcrLexicon.PRIMARY_TYPE, DnaLexicon.RESOURCE).and(JcrLexicon.DATA,
+ TEST_CONTENT.getBytes()).orReplace().and();
+ kind = remoteRepos.checkPath("testFile1", -1);
+ assertThat(kind, is(SVNNodeKind.FILE));
+ fileProperties = new SVNProperties();
+ baos = new ByteArrayOutputStream();
+ remoteRepos.getFile("testFile1", -1, fileProperties, baos);
+ assertContents(baos, TEST_CONTENT);
+
+ // LEVEL 1
+ graph.create("/root/testFile").with(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FILE).orReplace().and();
+ graph.create("/root/testFile/jcr:content").with(JcrLexicon.PRIMARY_TYPE, DnaLexicon.RESOURCE).and(JcrLexicon.DATA,
+ TEST_CONTENT.getBytes()).orReplace().and();
+ kind = remoteRepos.checkPath("root/testFile", -1);
+ assertThat(kind, is(SVNNodeKind.FILE));
+ fileProperties = new SVNProperties();
+ baos = new ByteArrayOutputStream();
+ remoteRepos.getFile("root/testFile", -1, fileProperties, baos);
+ assertContents(baos, TEST_CONTENT);
+
+ // LEVEL 2
+ graph.create("/root/a/testFile").with(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FILE).orReplace().and();
+ graph.create("/root/a/testFile/jcr:content").with(JcrLexicon.PRIMARY_TYPE, DnaLexicon.RESOURCE).and(JcrLexicon.DATA,
+ TEST_CONTENT.getBytes()).orReplace().and();
+ kind = remoteRepos.checkPath("root/a/testFile", -1);
+ assertThat(kind, is(SVNNodeKind.FILE));
+ fileProperties = new SVNProperties();
+ baos = new ByteArrayOutputStream();
+ remoteRepos.getFile("root/a/testFile", -1, fileProperties, baos);
+ assertContents(baos, TEST_CONTENT);
+ }
+
+ @Test
+ public void shouldRespectConflictBehaviorOnCreate() throws Exception {
+ graph.create("/testFile").with(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FILE).orReplace().and();
+ graph.create("/testFile/jcr:content").with(JcrLexicon.PRIMARY_TYPE, DnaLexicon.RESOURCE).and(JcrLexicon.DATA,
+ TEST_CONTENT.getBytes()).orReplace().and();
+
+ graph.create("/testFile/jcr:content").with(JcrLexicon.PRIMARY_TYPE, DnaLexicon.RESOURCE).and(JcrLexicon.DATA,
+ "Should not overwrite".getBytes()).ifAbsent().and();
+
+ kind = remoteRepos.checkPath("testFile", -1);
+ assertThat(kind, is(SVNNodeKind.FILE));
+ fileProperties = new SVNProperties();
+ baos = new ByteArrayOutputStream();
+ remoteRepos.getFile("testFile", -1, fileProperties, baos);
+ assertContents(baos, TEST_CONTENT);
+ }
+
+ @Test
+ public void shouldBeAbleToCreateFileWithNoContent() throws Exception {
+ graph.create("/testEmptyFile").with(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FILE).orReplace().and();
+
+ kind = remoteRepos.checkPath("testEmptyFile", -1);
+ assertThat(kind, is(SVNNodeKind.FILE));
+ fileProperties = new SVNProperties();
+ baos = new ByteArrayOutputStream();
+ remoteRepos.getFile("testEmptyFile", -1, fileProperties, baos);
+ assertContents(baos, EMPTY_CONTENT);
+ }
+
+ @Test
+ public void shouldBeAbleToCreateFolder() throws Exception {
+ graph.create("/testFolder").orReplace().and();
+
+ kind = remoteRepos.checkPath("testFolder", -1);
+ assertThat(kind, is(SVNNodeKind.DIR));
+
+ graph.create("/root/testFolder").orReplace().and();
+
+ kind = remoteRepos.checkPath("root/testFolder", -1);
+ assertThat(kind, is(SVNNodeKind.DIR));
+
+ graph.create("/root/a/testFolder").orReplace().and();
+
+ kind = remoteRepos.checkPath("root/a/testFolder", -1);
+ assertThat(kind, is(SVNNodeKind.DIR));
+ }
+
+ @Test
+ public void shouldBeAbleToAddChildrenToFolder() throws Exception {
+ graph.create("/testFolder").orReplace().and();
+
+ kind = remoteRepos.checkPath("testFolder", -1);
+ assertThat(kind, is(SVNNodeKind.DIR));
+
+ graph.create("/testFolder/testFile").with(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FILE).orReplace().and();
+ graph.create("/testFolder/testFile/jcr:content").with(JcrLexicon.PRIMARY_TYPE, DnaLexicon.RESOURCE).and(JcrLexicon.DATA,
+ TEST_CONTENT.getBytes()).orReplace().and();
+
+ kind = remoteRepos.checkPath("testFolder/testFile", -1);
+ assertThat(kind, is(SVNNodeKind.FILE));
+ fileProperties = new SVNProperties();
+ baos = new ByteArrayOutputStream();
+ remoteRepos.getFile("testFolder/testFile", -1, fileProperties, baos);
+ assertContents(baos, TEST_CONTENT);
+
+ graph.create("/root/testFolder").orReplace().and();
+
+ kind = remoteRepos.checkPath("root/testFolder", -1);
+ assertThat(kind, is(SVNNodeKind.DIR));
+
+ graph.create("/root/testFolder/testFile").with(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FILE).orReplace().and();
+ graph.create("/root/testFolder/testFile/jcr:content").with(JcrLexicon.PRIMARY_TYPE, DnaLexicon.RESOURCE).and(JcrLexicon.DATA,
+ TEST_CONTENT.getBytes()).orReplace().and();
+
+ kind = remoteRepos.checkPath("root/testFolder/testFile", -1);
+ assertThat(kind, is(SVNNodeKind.FILE));
+ fileProperties = new SVNProperties();
+ baos = new ByteArrayOutputStream();
+ remoteRepos.getFile("root/testFolder/testFile", -1, fileProperties, baos);
+ assertContents(baos, TEST_CONTENT);
+ }
+
+ @Test
+ public void shouldBeAbleToCopyFile() throws Exception {
+ graph.create("/testFile").with(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FILE).orReplace().and();
+ graph.create("/testFile/jcr:content").with(JcrLexicon.PRIMARY_TYPE, DnaLexicon.RESOURCE).and(JcrLexicon.DATA,
+ TEST_CONTENT.getBytes()).orReplace().and();
+
+ kind = remoteRepos.checkPath("testFile", -1);
+ assertThat(kind, is(SVNNodeKind.FILE));
+ fileProperties = new SVNProperties();
+ baos = new ByteArrayOutputStream();
+ remoteRepos.getFile("testFile", -1, fileProperties, baos);
+ assertContents(baos, TEST_CONTENT);
+
+ graph.copy("/testFile").to("/copiedFile");
+ kind = remoteRepos.checkPath("copiedFile", -1);
+ assertThat(kind, is(SVNNodeKind.FILE));
+ fileProperties = new SVNProperties();
+ baos = new ByteArrayOutputStream();
+ remoteRepos.getFile("copiedFile", -1, fileProperties, baos);
+ assertContents(baos, TEST_CONTENT);
+ }
+
+ @Test
+ public void shouldBeAbleToCopyFolder() throws Exception {
+ graph.create("/testFolder").orReplace().and();
+ graph.create("/testFolder/testFile").with(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FILE).orReplace().and();
+ graph.create("/testFolder/testFile/jcr:content").with(JcrLexicon.PRIMARY_TYPE, DnaLexicon.RESOURCE).and(JcrLexicon.DATA,
+ TEST_CONTENT.getBytes()).orReplace().and();
+
+ kind = remoteRepos.checkPath("testFolder/testFile", -1);
+ assertThat(kind, is(SVNNodeKind.FILE));
+ fileProperties = new SVNProperties();
+ baos = new ByteArrayOutputStream();
+ remoteRepos.getFile("testFolder/testFile", -1, fileProperties, baos);
+ assertContents(baos, TEST_CONTENT);
+
+ graph.copy("/testFolder").to("/copiedFolder");
+ kind = remoteRepos.checkPath("copiedFolder", -1);
+ assertThat(kind, is(SVNNodeKind.DIR));
+ fileProperties = new SVNProperties();
+ baos = new ByteArrayOutputStream();
+ remoteRepos.getFile("copiedFolder/testFile", -1, fileProperties, baos);
+ assertContents(baos, TEST_CONTENT);
+ }
+
+ @Test
+ public void shouldBeAbleToDeleteFolder() throws Exception {
+ graph.create("/testFolder").orReplace().and();
+ graph.create("/testFolder/testFile").with(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FILE).orReplace().and();
+ graph.create("/testFolder/testFile/jcr:content").with(JcrLexicon.PRIMARY_TYPE, DnaLexicon.RESOURCE).and(JcrLexicon.DATA,
+ TEST_CONTENT.getBytes()).orReplace().and();
+ kind = remoteRepos.checkPath("testFolder/testFile", -1);
+ assertThat(kind, is(SVNNodeKind.FILE));
+ fileProperties = new SVNProperties();
+ baos = new ByteArrayOutputStream();
+ remoteRepos.getFile("testFolder/testFile", -1, fileProperties, baos);
+ assertContents(baos, TEST_CONTENT);
+
+ graph.getNodeAt("/testFolder");
+
+ graph.delete("/testFolder").and();
+
+ try {
+ graph.getNodeAt("/testFolder");
+ org.junit.Assert.fail("Expected PathNotFoundException for deleted folder");
+ } catch (PathNotFoundException expected) {
+ // Expected
+ }
+ }
+
+ @Test
+ public void shouldBeAbleToDeleteFile() throws Exception {
+ graph.create("/testFolder").orReplace().and();
+ graph.create("/testFolder/testFile").with(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FILE).orReplace().and();
+ graph.create("/testFolder/testFile/jcr:content").with(JcrLexicon.PRIMARY_TYPE, DnaLexicon.RESOURCE).and(JcrLexicon.DATA,
+ TEST_CONTENT.getBytes()).orReplace().and();
+ kind = remoteRepos.checkPath("testFolder/testFile", -1);
+ assertThat(kind, is(SVNNodeKind.FILE));
+ fileProperties = new SVNProperties();
+ baos = new ByteArrayOutputStream();
+ remoteRepos.getFile("testFolder/testFile", -1, fileProperties, baos);
+ assertContents(baos, TEST_CONTENT);
+
+ graph.getNodeAt("/testFolder/testFile");
+
+ graph.delete("/testFolder/testFile").and();
+
+ try {
+ graph.getNodeAt("/testFolder/testFile");
+ org.junit.Assert.fail("Expected PathNotFoundException for deleted file");
+ } catch (PathNotFoundException expected) {
+ // Expected
+ }
+ }
+
+ @Test
+ public void shouldBeAbleToClearFileByRemovingDataProperty() throws Exception {
+ graph.create("/testFile").with(JcrLexicon.PRIMARY_TYPE, JcrNtLexicon.FILE).orReplace().and();
+ graph.create("/testFile/jcr:content").with(JcrLexicon.PRIMARY_TYPE, DnaLexicon.RESOURCE).and(JcrLexicon.DATA,
+ TEST_CONTENT.getBytes()).orReplace().and();
+ kind = remoteRepos.checkPath("testFile", -1);
+ assertThat(kind, is(SVNNodeKind.FILE));
+ fileProperties = new SVNProperties();
+ baos = new ByteArrayOutputStream();
+ remoteRepos.getFile("testFile", -1, fileProperties, baos);
+ assertContents(baos, TEST_CONTENT);
+
+ graph.remove("jcr:data").on("/testFile/jcr:content").and();
+
+ kind = remoteRepos.checkPath("testFile", -1);
+ assertThat(kind, is(SVNNodeKind.FILE));
+ fileProperties = new SVNProperties();
+ baos = new ByteArrayOutputStream();
+ remoteRepos.getFile("testFile", -1, fileProperties, baos);
+ assertContents(baos, "");
+ }
+
+ protected void assertContents( ByteArrayOutputStream baos,
+ String contents ) {
+ assertThat(baos, notNullValue());
+ assertThat(baos.toString(), is(contents));
+ }
+}
Property changes on: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositoryConnectorWritableTest.java
___________________________________________________________________
Name: svn:keywords
+ Id Revision
Name: svn:eol-style
+ LF
Added: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositorySourceTest.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositorySourceTest.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositorySourceTest.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -0,0 +1,310 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.connector.svn2;
+
+import static org.hamcrest.core.Is.is;
+import static org.hamcrest.core.IsNull.notNullValue;
+import static org.hamcrest.core.IsNull.nullValue;
+import static org.junit.Assert.assertThat;
+import static org.mockito.Matchers.isNull;
+import static org.mockito.Mockito.mock;
+import java.util.ArrayList;
+import java.util.Enumeration;
+import java.util.HashMap;
+import java.util.Hashtable;
+import java.util.List;
+import java.util.Map;
+import java.util.UUID;
+import java.util.concurrent.TimeUnit;
+import javax.naming.Context;
+import javax.naming.Name;
+import javax.naming.RefAddr;
+import javax.naming.Reference;
+import javax.naming.spi.ObjectFactory;
+import org.jboss.dna.graph.ExecutionContext;
+import org.jboss.dna.graph.Subgraph;
+import org.jboss.dna.graph.cache.BasicCachePolicy;
+import org.jboss.dna.graph.connector.RepositoryConnection;
+import org.jboss.dna.graph.connector.RepositoryConnectionFactory;
+import org.jboss.dna.graph.connector.RepositoryContext;
+import org.jboss.dna.graph.connector.RepositorySourceException;
+import org.jboss.dna.graph.observe.Observer;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.BeforeClass;
+import org.junit.Test;
+import org.mockito.MockitoAnnotations;
+
+/**
+ * @author Serge Pagop
+ */
+public class SvnRepositorySourceTest {
+
+ private SvnRepositorySource source;
+ private RepositoryConnection connection;
+ private String validName;
+ private String validUuidPropertyName;
+ private static String url;
+ private String username;
+ private String password;
+ private UUID validRootNodeUuid;
+ private final ExecutionContext context = new ExecutionContext();
+
+ @BeforeClass
+ public static void beforeAny() throws Exception {
+ url = SvnConnectorTestUtil.createURL("src/test/resources/dummy_svn_repos", "target/copy_of dummy_svn_repos");
+ }
+
+ @Before
+ public void beforeEach() throws Exception {
+ MockitoAnnotations.initMocks(this);
+ this.source = new SvnRepositorySource();
+ // Set the mandatory properties ...
+ this.source.setName("Test Repository");
+ this.source.setUsername("sp");
+ this.source.setPassword("");
+ this.source.setRepositoryRootUrl(url);
+ this.source.initialize(new RepositoryContext() {
+
+ public Subgraph getConfiguration( int depth ) {
+ return null;
+ }
+
+ public ExecutionContext getExecutionContext() {
+ return context;
+ }
+
+ public Observer getObserver() {
+ return null;
+ }
+
+ public RepositoryConnectionFactory getRepositoryConnectionFactory() {
+ return new RepositoryConnectionFactory() {
+
+ public RepositoryConnection createConnection( String sourceName ) throws RepositorySourceException {
+ return null;
+ }
+
+ };
+ }
+
+ });
+ }
+
+ @After
+ public void afterEach() throws Exception {
+ if (this.connection != null) {
+ this.connection.close();
+ }
+ }
+
+ @Test
+ public void shouldReturnNonNullCapabilities() {
+ assertThat(source.getCapabilities(), is(notNullValue()));
+ }
+
+ @Test
+ public void shouldNotSupportSameNameSiblings() {
+ assertThat(source.getCapabilities().supportsSameNameSiblings(), is(false));
+ }
+
+ @Test
+ public void shouldNotSupportUpdatesByDefault() {
+ assertThat(source.getCapabilities().supportsUpdates(), is(false));
+ }
+
+ @Test
+ public void shouldHaveNullSourceNameUponConstruction() {
+ source = new SvnRepositorySource();
+ assertThat(source.getName(), is(nullValue()));
+ }
+
+ @Test
+ public void shouldAllowSettingName() {
+ source.setName("name you like");
+ assertThat(source.getName(), is("name you like"));
+ source.setName("name you do not like");
+ assertThat(source.getName(), is("name you do not like"));
+ }
+
+ @Test
+ public void shouldAllowSettingNameToNull() {
+ source.setName("something that can change the world");
+ source.setName(null);
+ assertThat(source.getName(), is(nullValue()));
+ }
+
+ @Test
+ public void shouldHaveDefaultRetryLimit() {
+ assertThat(source.getRetryLimit(), is(SvnRepositorySource.DEFAULT_RETRY_LIMIT));
+ }
+
+ @Test( expected = IllegalArgumentException.class )
+ public void shouldNotAllowNullSVNUrl() {
+ source.setRepositoryRootUrl(null);
+ }
+
+ @Test( expected = IllegalArgumentException.class )
+ public void shouldNotAllowEmptySVNUrl() {
+ source.setRepositoryRootUrl("");
+ }
+
+ @Test
+ public void shouldSetRetryLimitToZeroWhenSetWithNonPositiveValue() {
+ source.setRetryLimit(0);
+ assertThat(source.getRetryLimit(), is(0));
+ source.setRetryLimit(-1);
+ assertThat(source.getRetryLimit(), is(0));
+ source.setRetryLimit(-100);
+ assertThat(source.getRetryLimit(), is(0));
+ }
+
+ @Test
+ public void shouldAllowRetryLimitToBeSet() {
+ for (int i = 0; i != 100; ++i) {
+ source.setRetryLimit(i);
+ assertThat(source.getRetryLimit(), is(i));
+ }
+ }
+
+ @Test( expected = RepositorySourceException.class )
+ public void shouldFailToCreateConnectionIfSourceHasNoName() {
+ source.setName(null);
+ source.getConnection();
+ }
+
+ @Test( expected = RepositorySourceException.class )
+ public void shouldFailToCreateConnectionIfSourceHasNoUsername() {
+ source.setUsername(null);
+ source.getConnection();
+ }
+
+ @Test( expected = RepositorySourceException.class )
+ public void shouldFailToCreateConnectionIfSourceHasNoPassword() {
+ source.setPassword(null);
+ source.getConnection();
+ }
+
+ @Test
+ public void shouldCreateConnection() throws Exception {
+ connection = source.getConnection();
+ assertThat(connection, is(notNullValue()));
+ }
+
+ @Test
+ public void shouldCreateJndiReferenceAndRecreateObjectFromReference() throws Exception {
+ BasicCachePolicy cachePolicy = new BasicCachePolicy();
+ cachePolicy.setTimeToLive(1000L, TimeUnit.MILLISECONDS);
+ convertToAndFromJndiReference(validName, validRootNodeUuid, url, username, password, validUuidPropertyName, 100);
+ }
+
+ @Test
+ public void shouldCreateJndiReferenceAndRecreateObjectFromReferenceWithNullProperties() throws Exception {
+ BasicCachePolicy cachePolicy = new BasicCachePolicy();
+ cachePolicy.setTimeToLive(1000L, TimeUnit.MILLISECONDS);
+ convertToAndFromJndiReference("some source", null, "url1", null, null, null, 100);
+ convertToAndFromJndiReference(null, null, "url2", null, null, null, 100);
+ }
+
+ private void convertToAndFromJndiReference( String sourceName,
+ UUID rootNodeUuid,
+ String url,
+ String username,
+ String password,
+ String uuidPropertyName,
+ int retryLimit ) throws Exception {
+ source.setRetryLimit(retryLimit);
+ source.setName(sourceName);
+ source.setRepositoryRootUrl(url);
+ source.setUsername(username);
+ source.setPassword(password);
+
+ Reference ref = source.getReference();
+
+ assertThat(ref.getClassName(), is(SvnRepositorySource.class.getName()));
+ assertThat(ref.getFactoryClassName(), is(SvnRepositorySource.class.getName()));
+
+ Map<String, Object> refAttributes = new HashMap<String, Object>();
+ Enumeration<RefAddr> enumeration = ref.getAll();
+ while (enumeration.hasMoreElements()) {
+ RefAddr addr = enumeration.nextElement();
+ refAttributes.put(addr.getType(), addr.getContent());
+ }
+
+ assertThat((String)refAttributes.remove(SvnRepositorySource.SOURCE_NAME), is(source.getName()));
+ assertThat((String)refAttributes.remove(SvnRepositorySource.SVN_REPOSITORY_ROOT_URL), is(source.getRepositoryRootUrl()));
+ assertThat((String)refAttributes.remove(SvnRepositorySource.SVN_USERNAME), is(source.getUsername()));
+ assertThat((String)refAttributes.remove(SvnRepositorySource.SVN_PASSWORD), is(source.getPassword()));
+ assertThat((String)refAttributes.remove(SvnRepositorySource.ROOT_NODE_UUID), is(source.getRootNodeUuid().toString()));
+ assertThat((String)refAttributes.remove(SvnRepositorySource.RETRY_LIMIT), is(Integer.toString(source.getRetryLimit())));
+ assertThat((String)refAttributes.remove(SvnRepositorySource.ALLOW_CREATING_WORKSPACES),
+ is(Boolean.toString(source.isCreatingWorkspacesAllowed())));
+ assertThat((String)refAttributes.remove(SvnRepositorySource.DEFAULT_WORKSPACE),
+ is(source.getDirectoryForDefaultWorkspace()));
+ refAttributes.remove(SvnRepositorySource.PREDEFINED_WORKSPACE_NAMES);
+ assertThat(refAttributes.isEmpty(), is(true));
+
+ // Recreate the object, use a newly constructed source ...
+ ObjectFactory factory = new SvnRepositorySource();
+ Name name = mock(Name.class);
+ Context context = mock(Context.class);
+ Hashtable<?, ?> env = new Hashtable<Object, Object>();
+ SvnRepositorySource recoveredSource = (SvnRepositorySource)factory.getObjectInstance(ref, name, context, env);
+ assertThat(recoveredSource, is(notNullValue()));
+
+ assertThat(recoveredSource.getName(), is(source.getName()));
+ assertThat(recoveredSource.getRepositoryRootUrl(), is(source.getRepositoryRootUrl()));
+ assertThat(recoveredSource.getUsername(), is(source.getUsername()));
+ assertThat(recoveredSource.getPassword(), is(source.getPassword()));
+
+ assertThat(recoveredSource.equals(source), is(true));
+ assertThat(source.equals(recoveredSource), is(true));
+ }
+
+ @Test
+ public void shouldAllowMultipleConnectionsToBeOpenAtTheSameTime() throws Exception {
+ List<RepositoryConnection> connections = new ArrayList<RepositoryConnection>();
+ try {
+ for (int i = 0; i != 10; ++i) {
+ RepositoryConnection conn = source.getConnection();
+ assertThat(conn, is(notNullValue()));
+ connections.add(conn);
+ }
+ } finally {
+ // Close all open connections ...
+ for (RepositoryConnection conn : connections) {
+ if (conn != null) {
+ try {
+ conn.close();
+ } catch (Throwable t) {
+ t.printStackTrace();
+ }
+ }
+ }
+ }
+ }
+}
Property changes on: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRepositorySourceTest.java
___________________________________________________________________
Name: svn:keywords
+ Id Revision
Name: svn:eol-style
+ LF
Added: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRespositoryConnectorReadableTest.java
===================================================================
--- trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRespositoryConnectorReadableTest.java (rev 0)
+++ trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRespositoryConnectorReadableTest.java 2010-01-05 12:43:25 UTC (rev 1524)
@@ -0,0 +1,119 @@
+/*
+ * JBoss DNA (http://www.jboss.org/dna)
+ * See the COPYRIGHT.txt file distributed with this work for information
+ * regarding copyright ownership. Some portions may be licensed
+ * to Red Hat, Inc. under one or more contributor license agreements.
+ * See the AUTHORS.txt file in the distribution for a full listing of
+ * individual contributors.
+ *
+ * JBoss DNA is free software. Unless otherwise indicated, all code in JBoss DNA
+ * is licensed to you under the terms of the GNU Lesser General Public License as
+ * published by the Free Software Foundation; either version 2.1 of
+ * the License, or (at your option) any later version.
+ *
+ * JBoss DNA is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this software; if not, write to the Free
+ * Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
+ * 02110-1301 USA, or see the FSF site: http://www.fsf.org.
+ */
+package org.jboss.dna.connector.svn2;
+
+import static org.hamcrest.core.Is.is;
+import static org.hamcrest.core.IsNull.notNullValue;
+import static org.junit.Assert.assertThat;
+import java.util.List;
+import org.jboss.dna.graph.Graph;
+import org.jboss.dna.graph.JcrLexicon;
+import org.jboss.dna.graph.JcrNtLexicon;
+import org.jboss.dna.graph.Location;
+import org.jboss.dna.graph.Node;
+import org.jboss.dna.graph.connector.RepositorySource;
+import org.jboss.dna.graph.connector.test.ReadableConnectorTest;
+import org.junit.BeforeClass;
+import org.junit.Test;
+
+/**
+ * @author Serge Pagop
+ *
+ */
+public class SvnRespositoryConnectorReadableTest extends ReadableConnectorTest {
+
+ private static String url;
+
+ @BeforeClass
+ public static void beforeAny() throws Exception {
+ url = SvnConnectorTestUtil.createURL("src/test/resources/dummy_svn_repos", "target/copy_of dummy_svn_repos");
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.connector.test.AbstractConnectorTest#setUpSource()
+ */
+ @Override
+ protected RepositorySource setUpSource() throws Exception {
+ String[] predefinedWorkspaceNames = new String[]{url + "trunk", url + "tags"};
+ SvnRepositorySource source = new SvnRepositorySource();
+ source.setName("Test Repository");
+ source.setUsername("sp");
+ source.setPassword("");
+ source.setRepositoryRootUrl(url);
+ source.setPredefinedWorkspaceNames(predefinedWorkspaceNames);
+ source.setDirectoryForDefaultWorkspace(predefinedWorkspaceNames[0]);
+ source.setCreatingWorkspacesAllowed(false);
+
+ return source;
+ }
+
+ /**
+ * {@inheritDoc}
+ *
+ * @see org.jboss.dna.graph.connector.test.AbstractConnectorTest#initializeContent(org.jboss.dna.graph.Graph)
+ */
+ @Override
+ protected void initializeContent( Graph graph ) throws Exception {
+ // No need to initialize any content ...
+ }
+
+ @Test
+ public void shouldFindFolderSpecifiedInPathsAsNodesBelowRoot() {
+ Node root = graph.getNodeAt("/root");
+ assertThatNodeIsFolder(root);
+ Node dnaSubmission = graph.getNodeAt("/root/c/h/JBoss DNA Submission Receipt for JBoss World 2009.pdf");
+ assertThatNodeIsFile(dnaSubmission, "application/octet-stream", null);
+ }
+
+ public void assertThatNodeIsFolder( Node node ) {
+ assertThat(node, is(notNullValue()));
+ assertThat(node.getProperty(JcrLexicon.PRIMARY_TYPE).getFirstValue(), is((Object)JcrNtLexicon.FOLDER));
+ }
+
+ public void assertThatNodeIsFile( Node node,
+ String mimeType,
+ String contents ) {
+ assertThat(node, is(notNullValue()));
+ assertThat(node.getProperty(JcrLexicon.PRIMARY_TYPE).getFirstValue(), is((Object)JcrNtLexicon.FILE));
+
+ // Check that there is one child, and that the child is "jcr:content" ...
+ List<Location> children = node.getChildren();
+ assertThat(children.size(), is(1));
+ Location jcrContentLocation = children.get(0);
+ assertThat(jcrContentLocation.getPath().getLastSegment().getName(), is(JcrLexicon.CONTENT));
+
+ // Check that the "jcr:content" node is correct ...
+ Node jcrContent = graph.getNodeAt(jcrContentLocation);
+ assertThat(string(jcrContent.getProperty(JcrLexicon.MIMETYPE).getFirstValue()), is(mimeType));
+ if (contents != null) {
+ assertThat(string(jcrContent.getProperty(JcrLexicon.DATA).getFirstValue()), is(contents));
+ }
+
+ }
+
+}
Property changes on: trunk/extensions/dna-connector-svn/src/test/java/org/jboss/dna/connector/svn2/SvnRespositoryConnectorReadableTest.java
___________________________________________________________________
Name: svn:keywords
+ Id Revision
Name: svn:eol-style
+ LF
DNA SVN: r1523 - in trunk: dna-common/src/test/java/org/jboss/dna/common/util and 9 other directories.
by dna-commits@lists.jboss.org
Author: elvisisking
Date: 2010-01-04 15:58:21 -0500 (Mon, 04 Jan 2010)
New Revision: 1523
Modified:
trunk/dna-common/src/main/java/org/jboss/dna/common/util/ClassUtil.java
trunk/dna-common/src/test/java/org/jboss/dna/common/util/HashCodeTest.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlLexicon.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/oracle/OracleDdlLexicon.java
trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlLexicon.java
trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/IJcrConstants.java
trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/Status.java
trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/Utils.java
trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/http/HttpClientConnection.java
trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/FileNode.java
trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/FolderNode.java
trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/IJsonConstants.java
trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/JsonNode.java
trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/JsonRestClient.java
trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/JsonUtils.java
trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/RepositoryNode.java
trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/ServerNode.java
trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/StatusTest.java
trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/UtilsTest.java
trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/domain/RepositoryTest.java
trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/domain/ServerTest.java
trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/domain/WorkspaceTest.java
trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/json/JsonRestClientTest.java
Log:
DNA-619 Remove Eclipse IDE Comments That Denote A String Need Not Be Externalized: All //NON-NLS-x$ comments have been removed from the codebase.
Modified: trunk/dna-common/src/main/java/org/jboss/dna/common/util/ClassUtil.java
===================================================================
--- trunk/dna-common/src/main/java/org/jboss/dna/common/util/ClassUtil.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/dna-common/src/main/java/org/jboss/dna/common/util/ClassUtil.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -94,7 +94,7 @@
private static boolean addSeparator( boolean separatorNeeded,
StringBuffer text ) {
if (separatorNeeded) {
- text.append(", "); //$NON-NLS-1$
+ text.append(", ");
}
return true;
}
@@ -125,7 +125,7 @@
*/
public static String nonPackageQualifiedName( final Class<?> clazz ) {
// if (clazz == null) {
- // throw new IllegalArgumentException(I18n.format(CommonI18n.mustNotBeNull, "Class")); //$NON-NLS-1$
+ // throw new IllegalArgumentException(I18n.format(CommonI18n.mustNotBeNull, "Class"));
// }
String name = clazz.getName();
return name.substring(name.lastIndexOf('.') + 1);
@@ -138,7 +138,7 @@
*/
public static String nonPackageQualifiedName( final Object object ) {
// if (object == null) {
- // throw new IllegalArgumentException(I18n.format(CommonI18n.mustNotBeNull, "Object")); //$NON-NLS-1$
+ // throw new IllegalArgumentException(I18n.format(CommonI18n.mustNotBeNull, "Object"));
// }
return nonPackageQualifiedName(object.getClass());
}
Modified: trunk/dna-common/src/test/java/org/jboss/dna/common/util/HashCodeTest.java
===================================================================
--- trunk/dna-common/src/test/java/org/jboss/dna/common/util/HashCodeTest.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/dna-common/src/test/java/org/jboss/dna/common/util/HashCodeTest.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -59,7 +59,7 @@
@Test
public void shouldAcceptNullArguments() {
assertThat(HashCode.compute((Object)null), is(0));
- assertThat(HashCode.compute("abc", (Object)null), is(not(0))); //$NON-NLS-1$
+ assertThat(HashCode.compute("abc", (Object)null), is(not(0)));
}
}
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlLexicon.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlLexicon.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/derby/DerbyDdlLexicon.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -84,7 +84,7 @@
public static final Name TYPE_DROP_SYNONYM_STATEMENT = new BasicName(Namespace.URI, "dropSynonymStatement");
public static final Name TYPE_DROP_TRIGGER_STATEMENT = new BasicName(Namespace.URI, "dropTriggerStatement");
- public static final Name UNIQUE_INDEX = new BasicName(Namespace.URI, "unique"); //$NON-NLS-1$
- public static final Name TABLE_NAME = new BasicName(Namespace.URI, "tableName"); //$NON-NLS-1$
- public static final Name GENERATED_COLUMN_SPEC_CLAUSE = new BasicName(Namespace.URI, "generatedColumnSpecClause"); //$NON-NLS-1$
+ public static final Name UNIQUE_INDEX = new BasicName(Namespace.URI, "unique");
+ public static final Name TABLE_NAME = new BasicName(Namespace.URI, "tableName");
+ public static final Name GENERATED_COLUMN_SPEC_CLAUSE = new BasicName(Namespace.URI, "generatedColumnSpecClause");
}
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/oracle/OracleDdlLexicon.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/oracle/OracleDdlLexicon.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/oracle/OracleDdlLexicon.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -37,7 +37,7 @@
}
// MIXINS
- public static final Name TYPE_BACKSLASH_TERMINATOR = new BasicName(Namespace.URI, "backslashTerminator"); //$NON-NLS-1$
+ public static final Name TYPE_BACKSLASH_TERMINATOR = new BasicName(Namespace.URI, "backslashTerminator");
public static final Name TYPE_CREATE_CLUSTER_STATEMENT = new BasicName(Namespace.URI, "createIndexStatement");
public static final Name TYPE_CREATE_CONTEXT_STATEMENT = new BasicName(Namespace.URI, "createContextStatement");
@@ -151,11 +151,11 @@
public static final Name TYPE_RENAME_CONSTRAINT = new BasicName(Namespace.URI, "renameConstraint");
// PROPERTY NAMES
- public static final Name TARGET_OBJECT_TYPE = new BasicName(Namespace.URI, "targetObjectType"); //$NON-NLS-1$
- public static final Name TARGET_OBJECT_NAME = new BasicName(Namespace.URI, "targetObjectName"); //$NON-NLS-1$
- public static final Name COMMENT = new BasicName(Namespace.URI, "comment"); //$NON-NLS-1$
- public static final Name UNIQUE_INDEX = new BasicName(Namespace.URI, "unique"); //$NON-NLS-1$
- public static final Name BITMAP_INDEX = new BasicName(Namespace.URI, "bitmap"); //$NON-NLS-1$
- public static final Name TABLE_NAME = new BasicName(Namespace.URI, "tableName"); //$NON-NLS-1$
+ public static final Name TARGET_OBJECT_TYPE = new BasicName(Namespace.URI, "targetObjectType");
+ public static final Name TARGET_OBJECT_NAME = new BasicName(Namespace.URI, "targetObjectName");
+ public static final Name COMMENT = new BasicName(Namespace.URI, "comment");
+ public static final Name UNIQUE_INDEX = new BasicName(Namespace.URI, "unique");
+ public static final Name BITMAP_INDEX = new BasicName(Namespace.URI, "bitmap");
+ public static final Name TABLE_NAME = new BasicName(Namespace.URI, "tableName");
}
Modified: trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlLexicon.java
===================================================================
--- trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlLexicon.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/extensions/dna-sequencer-ddl/src/main/java/org/jboss/dna/sequencer/ddl/dialect/postgres/PostgresDdlLexicon.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -138,7 +138,7 @@
public static final Name SCHEMA_NAME = new BasicName(Namespace.URI, "schemaName");
// PROPERTY NAMES
- public static final Name TARGET_OBJECT_TYPE = new BasicName(Namespace.URI, "targetObjectType"); //$NON-NLS-1$
- public static final Name TARGET_OBJECT_NAME = new BasicName(Namespace.URI, "targetObjectName"); //$NON-NLS-1$
- public static final Name COMMENT = new BasicName(Namespace.URI, "comment"); //$NON-NLS-1$
+ public static final Name TARGET_OBJECT_TYPE = new BasicName(Namespace.URI, "targetObjectType");
+ public static final Name TARGET_OBJECT_NAME = new BasicName(Namespace.URI, "targetObjectName");
+ public static final Name COMMENT = new BasicName(Namespace.URI, "comment");
}
Modified: trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/IJcrConstants.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/IJcrConstants.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/IJcrConstants.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -31,41 +31,41 @@
/**
* The JCR content property name (<code>jcr:content</code>).
*/
- String CONTENT_PROPERTY = "jcr:content"; //$NON-NLS-1$
+ String CONTENT_PROPERTY = "jcr:content";
/**
* The JCR data property name (<code>jcr:data</code>).
*/
- String DATA_PROPERTY = "jcr:data/base64/"; //$NON-NLS-1$
+ String DATA_PROPERTY = "jcr:data/base64/";
/**
* The JCR file node type (<code>nt:file</code>).
*/
- String FILE_NODE_TYPE = "nt:file"; //$NON-NLS-1$
+ String FILE_NODE_TYPE = "nt:file";
/**
* The JCR folder node type (<code>nt:folder</code>).
*/
- String FOLDER_NODE_TYPE = "nt:folder"; //$NON-NLS-1$
+ String FOLDER_NODE_TYPE = "nt:folder";
/**
* The JCR data property name (<code>jcr:lastModified</code>).
*/
- String LAST_MODIFIED = "jcr:lastModified"; //$NON-NLS-1$
+ String LAST_MODIFIED = "jcr:lastModified";
/**
* The JCR data property name (<code>jcr:lastModified</code>).
*/
- String MIME_TYPE = "jcr:mimeType"; //$NON-NLS-1$
+ String MIME_TYPE = "jcr:mimeType";
/**
* The JCR primary type property name (<code>jcr:primaryType</code>).
*/
- String PRIMARY_TYPE_PROPERTY = "jcr:primaryType"; //$NON-NLS-1$
+ String PRIMARY_TYPE_PROPERTY = "jcr:primaryType";
/**
* The JCR resource node type (<code>nt:resource</code>).
*/
- String RESOURCE_NODE_TYPE = "nt:resource"; //$NON-NLS-1$
+ String RESOURCE_NODE_TYPE = "nt:resource";
}
Modified: trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/Status.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/Status.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/Status.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -123,7 +123,7 @@
* @return the status message (never <code>null</code> but can be empty)
*/
public String getMessage() {
- return ((this.message == null) ? "" : this.message); //$NON-NLS-1$
+ return ((this.message == null) ? "" : this.message);
}
/**
@@ -175,11 +175,11 @@
*/
@Override
public String toString() {
- StringBuilder txt = new StringBuilder("Status "); //$NON-NLS-1$
- txt.append(this.severity.toString()).append(": "); //$NON-NLS-1$
- txt.append((getMessage().length() == 0) ? "<no message>" : getMessage()); //$NON-NLS-1$
- txt.append(" : "); //$NON-NLS-1$
- txt.append((getException() == null) ? "<no error>" : getException()); //$NON-NLS-1$
+ StringBuilder txt = new StringBuilder("Status ");
+ txt.append(this.severity.toString()).append(": ");
+ txt.append((getMessage().length() == 0) ? "<no message>" : getMessage());
+ txt.append(" : ");
+ txt.append((getException() == null) ? "<no error>" : getException());
return txt.toString();
}
Modified: trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/Utils.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/Utils.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/Utils.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -68,7 +68,7 @@
public static String getMimeType( File file ) {
if (mimeTypeUtils == null) {
// load custom extensions
- InputStream stream = Thread.currentThread().getContextClassLoader().getResourceAsStream("org/jboss/dna/web/jcr/rest/client/mime.types"); //$NON-NLS-1$
+ InputStream stream = Thread.currentThread().getContextClassLoader().getResourceAsStream("org/jboss/dna/web/jcr/rest/client/mime.types");
Map<String, String> customMap = MimeTypeUtil.load(stream, null);
// construct
@@ -76,7 +76,7 @@
}
String mimeType = mimeTypeUtils.mimeTypeOf(file);
- return ((mimeType == null) ? "application/octet-stream" : mimeType); //$NON-NLS-1$
+ return ((mimeType == null) ? "application/octet-stream" : mimeType);
}
// ===========================================================================================================================
Modified: trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/http/HttpClientConnection.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/http/HttpClientConnection.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/http/HttpClientConnection.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -80,9 +80,9 @@
public HttpClientConnection( Server server,
URL url,
RequestMethod method ) throws Exception {
- CheckArg.isNotNull(server, "server"); //$NON-NLS-1$
- CheckArg.isNotNull(url, "url"); //$NON-NLS-1$
- CheckArg.isNotNull(method, "method"); //$NON-NLS-1$
+ CheckArg.isNotNull(server, "server");
+ CheckArg.isNotNull(url, "url");
+ CheckArg.isNotNull(method, "method");
this.httpClient = new DefaultHttpClient();
this.httpClient.getCredentialsProvider().setCredentials(new AuthScope(url.getHost(), url.getPort()),
@@ -149,7 +149,7 @@
* @throws Exception if there is a problem writing to the connection
*/
public void write( byte[] bytes ) throws Exception {
- CheckArg.isNotNull(bytes, "bytes"); //$NON-NLS-1$
+ CheckArg.isNotNull(bytes, "bytes");
ByteArrayEntity entity = new ByteArrayEntity(bytes);
entity.setContentType(MediaType.APPLICATION_JSON);
Modified: trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/FileNode.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/FileNode.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/FileNode.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -78,8 +78,8 @@
File file ) throws Exception {
super(file.getName());
- CheckArg.isNotNull(workspace, "workspace"); //$NON-NLS-1$
- CheckArg.isNotNull(path, "path"); //$NON-NLS-1$
+ CheckArg.isNotNull(workspace, "workspace");
+ CheckArg.isNotNull(path, "path");
this.file = file;
this.path = path;
@@ -106,8 +106,8 @@
// add required jcr:lastModified property
Calendar lastModified = Calendar.getInstance();
lastModified.setTimeInMillis(file.lastModified());
- SimpleDateFormat formatter = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"); //$NON-NLS-1$
- formatter.setTimeZone(TimeZone.getTimeZone("GMT")); //$NON-NLS-1$
+ SimpleDateFormat formatter = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'");
+ formatter.setTimeZone(TimeZone.getTimeZone("GMT"));
properties.put(IJcrConstants.LAST_MODIFIED, formatter.format(lastModified.getTime()));
// add required jcr:mimeType property (just use a default value)
@@ -143,7 +143,7 @@
* @see #getFileContentsUrl()
*/
String getFileContents( String jsonResponse ) throws Exception {
- CheckArg.isNotNull(jsonResponse, "jsonResponse"); //$NON-NLS-1$
+ CheckArg.isNotNull(jsonResponse, "jsonResponse");
JSONObject contentNode = new JSONObject(jsonResponse);
JSONObject props = (JSONObject)contentNode.get(IJsonConstants.PROPERTIES_KEY);
String encodedContents = props.getString(IJcrConstants.DATA_PROPERTY);
Modified: trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/FolderNode.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/FolderNode.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/FolderNode.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -59,8 +59,8 @@
String fullPath ) throws Exception {
super(fullPath);
- CheckArg.isNotNull(workspace, "workspace"); //$NON-NLS-1$
- CheckArg.isNotNull(fullPath, "fullPath"); //$NON-NLS-1$
+ CheckArg.isNotNull(workspace, "workspace");
+ CheckArg.isNotNull(fullPath, "fullPath");
this.workspace = workspace;
@@ -96,12 +96,12 @@
// make sure path starts with a '/'
String path = getPath();
- if (!path.startsWith("/")) { //$NON-NLS-1$
+ if (!path.startsWith("/")) {
path = '/' + path;
}
// make sure path does NOT end with a '/'
- if (path.endsWith("/")) { //$NON-NLS-1$
+ if (path.endsWith("/")) {
path = path.substring(0, path.length() - 1);
}
Modified: trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/IJsonConstants.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/IJsonConstants.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/IJsonConstants.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -57,21 +57,21 @@
/**
* The key in the <code>JSONObject</code> whose value is the collection of node children.
*/
- String CHILDREN_KEY = "children"; //$NON-NLS-1$
+ String CHILDREN_KEY = "children";
/**
* The key in the <code>JSONObject</code> whose value is the collection of node properties.
*/
- String PROPERTIES_KEY = "properties"; //$NON-NLS-1$
+ String PROPERTIES_KEY = "properties";
/**
* The server context added to URLs.
*/
- String SERVER_CONTEXT = "/resources"; //$NON-NLS-1$
+ String SERVER_CONTEXT = "/resources";
/**
* The workspace context added to the URLs.
*/
- String WORKSPACE_CONTEXT = "/items"; //$NON-NLS-1$
+ String WORKSPACE_CONTEXT = "/items";
}
Modified: trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/JsonNode.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/JsonNode.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/JsonNode.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -50,7 +50,7 @@
* @param id the node identifier (never <code>null</code>)
*/
protected JsonNode( String id ) {
- CheckArg.isNotNull(id, "id"); //$NON-NLS-1$
+ CheckArg.isNotNull(id, "id");
this.id = id;
}
@@ -87,16 +87,16 @@
@Override
public String toString() {
StringBuilder txt = new StringBuilder();
- txt.append("ID: ").append(getId()); //$NON-NLS-1$
- txt.append(", URL: "); //$NON-NLS-1$
+ txt.append("ID: ").append(getId());
+ txt.append(", URL: ");
try {
txt.append(getUrl());
} catch (Exception e) {
- txt.append("exception obtaining URL"); //$NON-NLS-1$
+ txt.append("exception obtaining URL");
}
- txt.append(", content: ").append(super.toString()); //$NON-NLS-1$
+ txt.append(", content: ").append(super.toString());
return txt.toString();
}
Modified: trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/JsonRestClient.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/JsonRestClient.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/JsonRestClient.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -68,7 +68,7 @@
private HttpClientConnection connect( Server server,
URL url,
RequestMethod method ) throws Exception {
- this.logger.trace("connect: url={0}, method={1}", url, method); //$NON-NLS-1$
+ this.logger.trace("connect: url={0}, method={1}", url, method);
return new HttpClientConnection(server, url, method);
}
@@ -83,12 +83,12 @@
private void createFileNode( Workspace workspace,
String path,
File file ) throws Exception {
- this.logger.trace("createFileNode: workspace={0}, path={1}, file={2}", workspace.getName(), path, file.getAbsolutePath()); //$NON-NLS-1$
+ this.logger.trace("createFileNode: workspace={0}, path={1}, file={2}", workspace.getName(), path, file.getAbsolutePath());
FileNode fileNode = new FileNode(workspace, path, file);
HttpClientConnection connection = connect(workspace.getServer(), fileNode.getUrl(), RequestMethod.POST);
try {
- this.logger.trace("createFileNode: create node={0}", fileNode); //$NON-NLS-1$
+ this.logger.trace("createFileNode: create node={0}", fileNode);
connection.write(fileNode.getContent());
// make sure node was created
@@ -96,13 +96,13 @@
if (responseCode != HttpURLConnection.HTTP_CREATED) {
// node was not created
- this.logger.error(RestClientI18n.connectionErrorMsg, responseCode, "createFileNode"); //$NON-NLS-1$
+ this.logger.error(RestClientI18n.connectionErrorMsg, responseCode, "createFileNode");
String msg = RestClientI18n.createFileFailedMsg.text(file.getName(), path, workspace.getName(), responseCode);
throw new RuntimeException(msg);
}
} finally {
if (connection != null) {
- this.logger.trace("createFileNode: leaving"); //$NON-NLS-1$
+ this.logger.trace("createFileNode: leaving");
connection.disconnect();
}
}
@@ -117,12 +117,12 @@
*/
private void createFolderNode( Workspace workspace,
String path ) throws Exception {
- this.logger.trace("createFolderNode: workspace={0}, path={1}", workspace.getName(), path); //$NON-NLS-1$
+ this.logger.trace("createFolderNode: workspace={0}, path={1}", workspace.getName(), path);
FolderNode folderNode = new FolderNode(workspace, path);
HttpClientConnection connection = connect(workspace.getServer(), folderNode.getUrl(), RequestMethod.POST);
try {
- this.logger.trace("createFolderNode: create node={0}", folderNode); //$NON-NLS-1$
+ this.logger.trace("createFolderNode: create node={0}", folderNode);
connection.write(folderNode.getContent());
// make sure node was created
@@ -130,13 +130,13 @@
if (responseCode != HttpURLConnection.HTTP_CREATED) {
// node was not created
- this.logger.error(RestClientI18n.connectionErrorMsg, responseCode, "createFolderNode"); //$NON-NLS-1$
+ this.logger.error(RestClientI18n.connectionErrorMsg, responseCode, "createFolderNode");
String msg = RestClientI18n.createFolderFailedMsg.text(path, workspace.getName(), responseCode);
throw new RuntimeException(msg);
}
} finally {
if (connection != null) {
- this.logger.trace("createFolderNode: leaving"); //$NON-NLS-1$
+ this.logger.trace("createFolderNode: leaving");
connection.disconnect();
}
}
@@ -151,7 +151,7 @@
*/
private void ensureFolderExists( Workspace workspace,
String folderPath ) throws Exception {
- this.logger.trace("ensureFolderExists: workspace={0}, path={1}", workspace.getName(), folderPath); //$NON-NLS-1$
+ this.logger.trace("ensureFolderExists: workspace={0}, path={1}", workspace.getName(), folderPath);
FolderNode folderNode = new FolderNode(workspace, folderPath);
if (!pathExists(workspace.getServer(), folderNode.getUrl())) {
@@ -189,8 +189,8 @@
* @see org.jboss.dna.web.jcr.rest.client.IRestClient#getRepositories(org.jboss.dna.web.jcr.rest.client.domain.Server)
*/
public Collection<Repository> getRepositories( Server server ) throws Exception {
- CheckArg.isNotNull(server, "server"); //$NON-NLS-1$
- this.logger.trace("getRepositories: server={0}", server); //$NON-NLS-1$
+ CheckArg.isNotNull(server, "server");
+ this.logger.trace("getRepositories: server={0}", server);
ServerNode serverNode = new ServerNode(server);
HttpClientConnection connection = connect(server, serverNode.getFindRepositoriesUrl(), RequestMethod.GET);
@@ -203,12 +203,12 @@
}
// not a good response code
- this.logger.error(RestClientI18n.connectionErrorMsg, responseCode, "getRepositories"); //$NON-NLS-1$
+ this.logger.error(RestClientI18n.connectionErrorMsg, responseCode, "getRepositories");
String msg = RestClientI18n.getRepositoriesFailedMsg.text(server.getName(), responseCode);
throw new RuntimeException(msg);
} finally {
if (connection != null) {
- this.logger.trace("getRepositories: leaving"); //$NON-NLS-1$
+ this.logger.trace("getRepositories: leaving");
connection.disconnect();
}
}
@@ -223,9 +223,9 @@
public URL getUrl( File file,
String path,
Workspace workspace ) throws Exception {
- CheckArg.isNotNull(file, "file"); //$NON-NLS-1$
- CheckArg.isNotNull(path, "path"); //$NON-NLS-1$
- CheckArg.isNotNull(workspace, "workspace"); //$NON-NLS-1$
+ CheckArg.isNotNull(file, "file");
+ CheckArg.isNotNull(path, "path");
+ CheckArg.isNotNull(workspace, "workspace");
// can't be a directory
if (file.isDirectory()) {
@@ -241,8 +241,8 @@
* @see org.jboss.dna.web.jcr.rest.client.IRestClient#getWorkspaces(org.jboss.dna.web.jcr.rest.client.domain.Repository)
*/
public Collection<Workspace> getWorkspaces( Repository repository ) throws Exception {
- CheckArg.isNotNull(repository, "repository"); //$NON-NLS-1$
- this.logger.trace("getWorkspaces: repository={0}", repository); //$NON-NLS-1$
+ CheckArg.isNotNull(repository, "repository");
+ this.logger.trace("getWorkspaces: repository={0}", repository);
RepositoryNode repositoryNode = new RepositoryNode(repository);
HttpClientConnection connection = connect(repository.getServer(), repositoryNode.getUrl(), RequestMethod.GET);
@@ -255,14 +255,14 @@
}
// not a good response code
- this.logger.error(RestClientI18n.connectionErrorMsg, responseCode, "getWorkspaces"); //$NON-NLS-1$
+ this.logger.error(RestClientI18n.connectionErrorMsg, responseCode, "getWorkspaces");
String msg = RestClientI18n.getWorkspacesFailedMsg.text(repository.getName(),
repository.getServer().getName(),
responseCode);
throw new RuntimeException(msg);
} finally {
if (connection != null) {
- this.logger.trace("getWorkspaces: leaving"); //$NON-NLS-1$
+ this.logger.trace("getWorkspaces: leaving");
connection.disconnect();
}
}
@@ -300,16 +300,16 @@
*/
private boolean pathExists( Server server,
URL url ) throws Exception {
- this.logger.trace("pathExists: url={0}", url); //$NON-NLS-1$
+ this.logger.trace("pathExists: url={0}", url);
HttpClientConnection connection = connect(server, url, RequestMethod.GET);
try {
int responseCode = connection.getResponseCode();
- this.logger.trace("pathExists: responseCode={0}", responseCode); //$NON-NLS-1$
+ this.logger.trace("pathExists: responseCode={0}", responseCode);
return (responseCode == HttpURLConnection.HTTP_OK);
} finally {
if (connection != null) {
- this.logger.trace("pathExists: leaving"); //$NON-NLS-1$
+ this.logger.trace("pathExists: leaving");
connection.disconnect();
}
}
@@ -338,10 +338,10 @@
public Status publish( Workspace workspace,
String path,
File file ) {
- CheckArg.isNotNull(workspace, "workspace"); //$NON-NLS-1$
- CheckArg.isNotNull(path, "path"); //$NON-NLS-1$
- CheckArg.isNotNull(file, "file"); //$NON-NLS-1$
- this.logger.trace("publish: workspace={0}, path={1}, file={2}", workspace.getName(), path, file.getAbsolutePath()); //$NON-NLS-1$
+ CheckArg.isNotNull(workspace, "workspace");
+ CheckArg.isNotNull(path, "path");
+ CheckArg.isNotNull(file, "file");
+ this.logger.trace("publish: workspace={0}, path={1}, file={2}", workspace.getName(), path, file.getAbsolutePath());
try {
// first delete if file exists at that path
@@ -371,10 +371,10 @@
public Status unpublish( Workspace workspace,
String path,
File file ) {
- CheckArg.isNotNull(workspace, "workspace"); //$NON-NLS-1$
- CheckArg.isNotNull(path, "path"); //$NON-NLS-1$
- CheckArg.isNotNull(file, "file"); //$NON-NLS-1$
- this.logger.trace("unpublish: workspace={0}, path={1}, file={2}", workspace.getName(), path, file.getAbsolutePath()); //$NON-NLS-1$
+ CheckArg.isNotNull(workspace, "workspace");
+ CheckArg.isNotNull(path, "path");
+ CheckArg.isNotNull(file, "file");
+ this.logger.trace("unpublish: workspace={0}, path={1}, file={2}", workspace.getName(), path, file.getAbsolutePath());
HttpClientConnection connection = null;
@@ -382,7 +382,7 @@
FileNode fileNode = new FileNode(workspace, path, file);
connection = connect(workspace.getServer(), fileNode.getUrl(), RequestMethod.DELETE);
int responseCode = connection.getResponseCode();
- this.logger.trace("responseCode={0}", responseCode); //$NON-NLS-1$
+ this.logger.trace("responseCode={0}", responseCode);
if (responseCode != HttpURLConnection.HTTP_NO_CONTENT) {
// check to see if the file was never published
@@ -392,7 +392,7 @@
}
// unexpected result
- this.logger.error(RestClientI18n.connectionErrorMsg, responseCode, "unpublish"); //$NON-NLS-1$
+ this.logger.error(RestClientI18n.connectionErrorMsg, responseCode, "unpublish");
String msg = RestClientI18n.unpublishFailedMsg.text(file.getName(), workspace.getName(), path);
throw new RuntimeException(msg);
}
@@ -403,7 +403,7 @@
return new Status(Severity.ERROR, msg, e);
} finally {
if (connection != null) {
- this.logger.trace("unpublish: leaving"); //$NON-NLS-1$
+ this.logger.trace("unpublish: leaving");
connection.disconnect();
}
}
Modified: trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/JsonUtils.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/JsonUtils.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/JsonUtils.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -43,7 +43,7 @@
/**
* The default character set being used.
*/
- private static final String DEFAULT_CHARSET = "UTF-8"; //$NON-NLS-1$ // TODO need to property drive charset
+ private static final String DEFAULT_CHARSET = "UTF-8"; // TODO need to property drive charset
// ===========================================================================================================================
// Class Methods
@@ -55,7 +55,7 @@
* @throws UnsupportedEncodingException if the charset is not supported
*/
public static String decode( String text ) throws UnsupportedEncodingException {
- CheckArg.isNotNull(text, "text"); //$NON-NLS-1$
+ CheckArg.isNotNull(text, "text");
return URLDecoder.decode(text, DEFAULT_CHARSET);
}
@@ -67,7 +67,7 @@
* @throws UnsupportedEncodingException if the charset is not supported
*/
public static String encode( String text ) throws UnsupportedEncodingException {
- CheckArg.isNotNull(text, "text"); //$NON-NLS-1$
+ CheckArg.isNotNull(text, "text");
// don't encode '/' as it needs to stay that way in the URL
StringBuilder encoded = new StringBuilder();
@@ -87,7 +87,7 @@
* @throws IOException if there is a problem reading from the connection
*/
public static String readInputStream( HttpURLConnection connection ) throws IOException {
- CheckArg.isNotNull(connection, "connection"); //$NON-NLS-1$
+ CheckArg.isNotNull(connection, "connection");
InputStream stream = connection.getInputStream();
int bytesRead;
Modified: trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/RepositoryNode.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/RepositoryNode.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/RepositoryNode.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -87,7 +87,7 @@
*/
@SuppressWarnings( "unchecked" )
public Collection<Workspace> getWorkspaces( String jsonResponse ) throws Exception {
- CheckArg.isNotNull(jsonResponse, "jsonResponse"); //$NON-NLS-1$
+ CheckArg.isNotNull(jsonResponse, "jsonResponse");
Collection<Workspace> workspaces = new ArrayList<Workspace>();
JSONObject jsonObj = new JSONObject(jsonResponse);
Modified: trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/ServerNode.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/ServerNode.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/main/java/org/jboss/dna/web/jcr/rest/client/json/ServerNode.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -81,13 +81,12 @@
StringBuilder url = new StringBuilder(this.server.getUrl());
// strip off last '/' if necessary
- if (url.lastIndexOf("/") == (url.length() - 1)) { //$NON-NLS-1$
+ if (url.lastIndexOf("/") == (url.length() - 1)) {
url.delete((url.length() - 1), (url.length() - 1));
}
- // append server context and insert user
+ // append server context
url.append(IJsonConstants.SERVER_CONTEXT);
- // url.insert(url.indexOf(":") + 1, this.server.getUser() + '@'); //$NON-NLS-1$
return new URL(url.toString());
}
@@ -107,9 +106,9 @@
*/
@SuppressWarnings( "unchecked" )
public Collection<Repository> getRepositories( String jsonResponse ) throws Exception {
- CheckArg.isNotNull(jsonResponse, "jsonResponse"); //$NON-NLS-1$
+ CheckArg.isNotNull(jsonResponse, "jsonResponse");
Collection<Repository> repositories = new ArrayList<Repository>();
- this.logger.trace("getRepositories:jsonResponse={0}", jsonResponse); //$NON-NLS-1$
+ this.logger.trace("getRepositories:jsonResponse={0}", jsonResponse);
JSONObject jsonObj = new JSONObject(jsonResponse);
// keys are the repository names
@@ -117,7 +116,7 @@
String name = JsonUtils.decode(itr.next());
Repository repository = new Repository(name, this.server);
repositories.add(repository);
- this.logger.trace("getRepositories: adding repository={0}", repository); //$NON-NLS-1$
+ this.logger.trace("getRepositories: adding repository={0}", repository);
}
return repositories;
Modified: trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/StatusTest.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/StatusTest.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/StatusTest.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -126,17 +126,17 @@
@Test
public void shouldBeAbleToPrintWithMessageAndNullException() {
- new Status(Severity.WARNING, "the message goes here", null).toString(); //$NON-NLS-1$
+ new Status(Severity.WARNING, "the message goes here", null).toString();
}
@Test
public void shouldBeAbleToPrintWithMessageAndException() {
- new Status(Severity.WARNING, "the message goes here", new RuntimeException("exception message")).toString(); //$NON-NLS-1$ //$NON-NLS-2$
+ new Status(Severity.WARNING, "the message goes here", new RuntimeException("exception message")).toString();
}
@Test
public void shouldBeAbleToPrintWithNullMessageAndException() {
- new Status(Severity.WARNING, null, new RuntimeException("exception message")).toString(); //$NON-NLS-1$
+ new Status(Severity.WARNING, null, new RuntimeException("exception message")).toString();
}
@Test
Modified: trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/UtilsTest.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/UtilsTest.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/UtilsTest.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -55,7 +55,7 @@
@Test
public void shouldHaveCorrectMimetypeForEclipseTextFiles() {
for (String extension : TEXT_EXTENSIONS) {
- String mimetype = Utils.getMimeType(new File('.' + extension)); //$NON-NLS-1$
+ String mimetype = Utils.getMimeType(new File('.' + extension));
assertThat(mimetype, is(TEXT_MIMETYPE));
}
}
Modified: trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/domain/RepositoryTest.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/domain/RepositoryTest.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/domain/RepositoryTest.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -40,13 +40,13 @@
// Constants
// ===========================================================================================================================
- private static final String NAME1 = "name1"; //$NON-NLS-1$
+ private static final String NAME1 = "name1";
- private static final String NAME2 = "name2"; //$NON-NLS-1$
+ private static final String NAME2 = "name2";
- private static final Server SERVER1 = new Server("file:/tmp/temp.txt", "user", "pswd"); //$NON-NLS-1$ //$NON-NLS-2$ //$NON-NLS-3$
+ private static final Server SERVER1 = new Server("file:/tmp/temp.txt", "user", "pswd");
- private static final Server SERVER2 = new Server("http:www.redhat.com", "user", "pswd"); //$NON-NLS-1$ //$NON-NLS-2$ //$NON-NLS-3$
+ private static final Server SERVER2 = new Server("http:www.redhat.com", "user", "pswd");
private static final Repository REPOSITORY1 = new Repository(NAME1, SERVER1);
Modified: trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/domain/ServerTest.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/domain/ServerTest.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/domain/ServerTest.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -40,14 +40,14 @@
// Constants
// ===========================================================================================================================
- private static final String URL1 = "file:/tmp/temp.txt"; //$NON-NLS-1$
- private static final String URL2 = "http:www.redhat.com"; //$NON-NLS-1$
+ private static final String URL1 = "file:/tmp/temp.txt";
+ private static final String URL2 = "http:www.redhat.com";
- private static final String USER1 = "user1"; //$NON-NLS-1$
- private static final String USER2 = "user2"; //$NON-NLS-1$
+ private static final String USER1 = "user1";
+ private static final String USER2 = "user2";
- private static final String PSWD1 = "pwsd1"; //$NON-NLS-1$
- private static final String PSWD2 = "pwsd2"; //$NON-NLS-1$
+ private static final String PSWD1 = "pwsd1";
+ private static final String PSWD2 = "pwsd2";
private static Server SERVER1 = new Server(URL1, USER1, PSWD1);
Modified: trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/domain/WorkspaceTest.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/domain/WorkspaceTest.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/domain/WorkspaceTest.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -40,13 +40,13 @@
// Constants
// ===========================================================================================================================
- private static final String NAME1 = "name1"; //$NON-NLS-1$
+ private static final String NAME1 = "name1";
- private static final String NAME2 = "name2"; //$NON-NLS-1$
+ private static final String NAME2 = "name2";
- private static final Server SERVER1 = new Server("file:/tmp/temp.txt", "user", "pswd"); //$NON-NLS-1$ //$NON-NLS-2$ //$NON-NLS-3$
+ private static final Server SERVER1 = new Server("file:/tmp/temp.txt", "user", "pswd");
- private static final Server SERVER2 = new Server("http:www.redhat.com", "user", "pswd"); //$NON-NLS-1$ //$NON-NLS-2$ //$NON-NLS-3$
+ private static final Server SERVER2 = new Server("http:www.redhat.com", "user", "pswd");
private static final Repository REPOSITORY1 = new Repository(NAME1, SERVER1);
Modified: trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/json/JsonRestClientTest.java
===================================================================
--- trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/json/JsonRestClientTest.java 2010-01-04 20:11:45 UTC (rev 1522)
+++ trunk/web/dna-web-jcr-rest-client/src/test/java/org/jboss/dna/web/jcr/rest/client/json/JsonRestClientTest.java 2010-01-04 20:58:21 UTC (rev 1523)
@@ -52,21 +52,21 @@
// ===========================================================================================================================
// user and password configured in pom
- private static final String PSWD = "password"; //$NON-NLS-1$
- private static final String USER = "dnauser"; //$NON-NLS-1$
+ private static final String PSWD = "password";
+ private static final String USER = "dnauser";
- private static final Server SERVER = new Server("http://localhost:8080", USER, PSWD); //$NON-NLS-1$
- private static final String REPOSITORY_NAME = "dna:repository"; //$NON-NLS-1$
+ private static final Server SERVER = new Server("http://localhost:8080", USER, PSWD);
+ private static final String REPOSITORY_NAME = "dna:repository";
private static final Repository REPOSITORY1 = new Repository(REPOSITORY_NAME, SERVER);
- private static final String WORKSPACE_NAME = "default"; //$NON-NLS-1$
+ private static final String WORKSPACE_NAME = "default";
private static final Workspace WORKSPACE1 = new Workspace(WORKSPACE_NAME, REPOSITORY1);
- private static final String WORKSPACE_PATH = "/myproject/myfolder/"; //$NON-NLS-1$
- private static final String FILE_PATH = WORKSPACE_PATH + "document.txt"; //$NON-NLS-1$
- private static final String BINARY_FILE_PATH = WORKSPACE_PATH + "picture.jpg"; //$NON-NLS-1$
+ private static final String WORKSPACE_PATH = "/myproject/myfolder/";
+ private static final String FILE_PATH = WORKSPACE_PATH + "document.txt";
+ private static final String BINARY_FILE_PATH = WORKSPACE_PATH + "picture.jpg";
- private static final String WORKSPACE_UNUSUALPATH = "/myproject/My.Test - Folder/"; //$NON-NLS-1$
- private static final String FILE_UNUSUALPATH = WORKSPACE_UNUSUALPATH + "Test File_.a-().txt"; //$NON-NLS-1$
+ private static final String WORKSPACE_UNUSUALPATH = "/myproject/My.Test - Folder/";
+ private static final String FILE_UNUSUALPATH = WORKSPACE_UNUSUALPATH + "Test File_.a-().txt";
// ===========================================================================================================================
// Fields
@@ -103,7 +103,7 @@
@Test
public void shouldNotUnpublishNonexistentFile() throws Exception {
- File file = new File("bogusfile"); //$NON-NLS-1$
+ File file = new File("bogusfile");
Status status = this.restClient.unpublish(WORKSPACE1, WORKSPACE_PATH, file);
if (status.isError()) {